
Java – How to download web page from internet?

A simple Java example that downloads a web page from the internet with java.net.URLConnection and prints its HTML source.

JavaDownloadWebPage.java

package com.mkyong;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

public class JavaDownloadWebPage {

    public static void main(String[] args) throws IOException {

        String result = downloadWebPage("https://mkyong.com");
        System.out.println(result);

    }

    private static String downloadWebPage(String url) throws IOException {

        StringBuilder result = new StringBuilder();
        String line;

        // open a connection and pretend to be a browser; some sites block the default Java user agent
        URLConnection urlConnection = new URL(url).openConnection();
        urlConnection.addRequestProperty("User-Agent", "Mozilla");

        // fail fast (in milliseconds) instead of hanging on a slow server
        urlConnection.setReadTimeout(5000);
        urlConnection.setConnectTimeout(5000);

        // read the response body with an explicit charset instead of the platform default
        try (InputStream is = urlConnection.getInputStream();
             BufferedReader br = new BufferedReader(new InputStreamReader(is, StandardCharsets.UTF_8))) {

            // readLine() strips line separators, so the page is collected as one long string
            while ((line = br.readLine()) != null) {
                result.append(line);
            }

        }

        return result.toString();

    }

}

Output


<!DOCTYPE html><html lang="en"><head><meta charset="utf-8" /><meta name="viewport" ... HTML source code
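If you are on Java 11 or above, the built-in java.net.http.HttpClient can do the same download with less boilerplate. The sketch below is only an illustration (the class name JavaDownloadWebPageHttpClient is not part of the original tutorial); it uses the same User-Agent and 5-second timeouts as the URLConnection example.

JavaDownloadWebPageHttpClient.java

package com.mkyong;

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class JavaDownloadWebPageHttpClient {

    public static void main(String[] args) throws IOException, InterruptedException {

        // the client is reusable; configure the connect timeout once
        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(5))
                .build();

        // same User-Agent and timeout as the URLConnection example above
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://mkyong.com"))
                .header("User-Agent", "Mozilla")
                .timeout(Duration.ofSeconds(5))
                .GET()
                .build();

        // BodyHandlers.ofString() reads the whole response body into a String
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body());

    }

}

The output should be the same HTML source shown above.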

Comments

DL Re
3 years ago

Hi MK,
What if the page you are trying to download has reference links to images and CSS? I want to download the complete web page programmatically for one of my requirements in Java. Any suggestions?

Robbie
3 years ago
Reply to DL Re

Figured it out yet? I’m scouring Google for answers. I need to extract the JS libraries used by websites.