Java: make a call to a website and find all photos

Marcin :

So I'm about to create a project that makes an API call, takes the returned data, looks for photos, and displays them to the user as a slideshow.
I want to make an API call to National Geographic Photo of the Day. I have found the National Geographic Photo of the Day Archive and want to request that page, save all the photos from the gallery somewhere, and then let the user decide whether they like each photo. How should I approach this? So far I have only tried to establish a connection with the linked gallery:

package javaapplication1;

import java.net.*;
import java.io.*;
import javax.imageio.ImageIO;

public class JavaApplication1 {

    public static void main(String[] args) throws Exception {
        // Open a connection to the archive page
        URL natgeo = new URL("https://www.nationalgeographic.com/photography/photo-of-the-day/archive/");
        URLConnection yc = natgeo.openConnection();

        // Read the raw HTML line by line and print it to the console;
        // try-with-resources closes the reader even if an exception is thrown
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(yc.getInputStream()))) {
            String inputLine;
            while ((inputLine = in.readLine()) != null) {
                System.out.println(inputLine);
            }
        }
    }

}

That prints the raw HTML to the console, but I have no idea how to read what came back as the answer. I don't know whether a National Geographic API exists, so I don't know which approach would be better: finding an API and requesting the photos from it, or parsing the page, looking for images, and saving them locally.
I appreciate any help!
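For the "save them locally" step, the bytes you download can be decoded and written with `javax.imageio.ImageIO`, which the code above already imports. A minimal round-trip sketch, where an in-memory image stands in for a downloaded photo (a real one would come from `ImageIO.read(...)` on an image URL found in the page):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class SavePhotoDemo {

    // Encodes the image as PNG, writes it to the given file,
    // then reads it back from disk.
    public static BufferedImage saveAndReload(BufferedImage image, File file)
            throws IOException {
        ImageIO.write(image, "png", file);
        return ImageIO.read(file);
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a downloaded photo
        BufferedImage photo = new BufferedImage(64, 48, BufferedImage.TYPE_INT_RGB);

        File out = File.createTempFile("potd", ".png");
        out.deleteOnExit();

        BufferedImage reloaded = saveAndReload(photo, out);
        System.out.println(reloaded.getWidth() + "x" + reloaded.getHeight()); // 64x48
    }
}
```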

Yserbius :

What you're trying to do is called "web scraping". You don't just have to make a connection to the gallery; you also have to parse the HTML, pull out the URL of each image, and then download the images. I suggest you look into jsoup, a Java library built for exactly this. For image downloading and manipulation, the Java Image I/O library (`javax.imageio`) has a lot of useful functionality.
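To make the "parse the HTML and pull out the image URLs" step concrete, here is a dependency-free sketch. jsoup would do this far more robustly (e.g. `doc.select("img[src]")` with `absUrl("src")`); the regex below is only to illustrate the idea, and the HTML snippet it parses is made up:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ImageUrlExtractor {

    // Matches src="..." inside <img> tags. Fragile compared to a real
    // HTML parser such as jsoup, but enough to show the idea.
    private static final Pattern IMG_SRC =
            Pattern.compile("<img[^>]*\\ssrc=\"([^\"]+)\"", Pattern.CASE_INSENSITIVE);

    public static List<String> extractImageUrls(String html) {
        List<String> urls = new ArrayList<>();
        Matcher m = IMG_SRC.matcher(html);
        while (m.find()) {
            urls.add(m.group(1));
        }
        return urls;
    }

    public static void main(String[] args) {
        // Stand-in for the HTML read from the URLConnection in the question
        String html = "<div class=\"gallery\">"
                + "<img src=\"https://example.com/photo1.jpg\" alt=\"one\">"
                + "<img alt=\"two\" src=\"https://example.com/photo2.jpg\">"
                + "</div>";

        for (String url : extractImageUrls(html)) {
            System.out.println(url);
            // Next step (needs network access): download and save, e.g.
            // BufferedImage img = ImageIO.read(new URL(url));
            // ImageIO.write(img, "jpg", new File(...));
        }
    }
}
```

Once you have the URLs, downloading each image and showing the slideshow is a separate step; jsoup gives you the parsed document and selectors, while Image I/O handles decoding and saving.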
