Jupyter gets a major upgrade: it can now interact with large language models, and the project is fully open source

Source | Heart of the Machine
Author | Chen Ping

Now the Large Language Model (LLM) is connected with Jupyter!

This is largely thanks to a project called Jupyter AI, which is an officially supported subproject of Project Jupyter. At present, the project is completely open source, and the connected models mainly come from major star companies and institutions such as AI21, Anthropic, AWS, Cohere, and OpenAI.

With large language models behind it, Jupyter's capabilities change considerably: you can now generate code, summarize documentation, create comments, and fix bugs without leaving the environment. You can even generate entire notebooks from text prompts.

Installing Jupyter AI is straightforward:

pip install 'jupyter-ai>=1.0,<2.0' # If you use JupyterLab 3
pip install jupyter-ai # If you use JupyterLab 4
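
Depending on which provider you want to use, you will generally also need that provider's Python package and an API key. As a hedged example for OpenAI models (package and variable names may differ by provider and Jupyter AI version; the chat UI also lets you enter keys in its settings panel):

pip install openai                  # provider package, needed for OpenAI models
export OPENAI_API_KEY=<your-key>    # the %%ai magics read keys from environment variables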

In addition, Jupyter AI provides two interfaces for interacting with LLMs. In JupyterLab, you can use the chat interface to talk to an LLM and get help with your code. In any environment that supports notebooks or IPython, including JupyterLab, Notebook, IPython, Colab, and Visual Studio Code, you can invoke an LLM with the %%ai magic command.

Jupyter powered by large models

Next, let's see how it works.

Programming assistant

The Jupyter chat interface is shown in the figure below; users can hold a conversation with Jupyternaut, the programming assistant. Jupyternaut's welcome message reads, roughly: "Hi there, I'm Jupyternaut, your programming assistant. You can ask me questions using the text box, or by using the commands."

Next, the user asks Jupyternaut: "In Python, what is the difference between a tuple and a list?" Jupyternaut correctly explains the key differences between the two and finishes with an example:
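
The screenshot is not reproduced here, but the core distinction Jupyternaut points to is mutability: a list can be modified in place, while a tuple cannot. A minimal illustration of that difference:

coords_list = [1, 2]      # list: mutable sequence
coords_tuple = (1, 2)     # tuple: immutable sequence

coords_list.append(3)     # works: lists can be changed in place
coords_tuple + (3,)       # tuples can only produce new tuples
# coords_tuple[0] = 5     # would raise TypeError: 'tuple' object does not support item assignment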

If there is a part of the code you don't fully understand, you can select it, use it as a prompt, and ask Jupyternaut to explain it. Jupyternaut can also modify code, identify errors in it, and more.

If you are not satisfied with the code, you can also ask Jupyternaut to rewrite it according to your requirements:

When rewriting, Jupyternaut sends the selected code to the user's chosen language model and swaps in the rewritten version it gets back:

Generate notebook from text prompt

Jupyter AI's chat interface can generate a complete notebook from a text prompt. To do this, the user runs the "/generate" command followed by a text description.
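
The description simply states what the notebook should cover. A hypothetical example (the topic here is made up for illustration):

/generate A demonstration of how to use Matplotlib to plot a sine wave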

After Jupyternaut has generated the notebook, it sends the user a message containing the file name, and the user can open that file to view the result:

Access local files

You can use the "/learn" command to have Jupyternaut learn from local files, and then use the "/ask" command to ask questions about them. For example, you can tell Jupyternaut to learn from the Jupyter AI documentation itself:

Once Jupyternaut has finished learning, you can use the "/ask" command to ask questions:
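
As a rough sketch of the two commands (the directory path and the question are placeholders, not taken from the article):

/learn docs/
/ask How do I use the %%ai magic command?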

Magic commands

Jupyter AI also provides the %%ai magic command, which can be run in notebook cells and in the IPython command-line interface. Each %%ai command requires a model, usually specified as provider-id:model-id:
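
Outside the chat interface, the magic is loaded as an IPython extension first. A minimal sketch, assuming the OpenAI chat provider is set up and an API key is available in the environment (the prompt itself is invented for illustration):

%load_ext jupyter_ai_magics

%%ai openai-chat:gpt-3.5-turbo
Write a Python function that checks whether a string is a palindrome.

Note that the two magics go in separate cells, since %%ai must be the first line of its own cell.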

Another researcher tried out the %%ai magic command to call ChatGPT:

Additionally, you can customize the format of the output, including HTML, math, source code, and images, using the -f or --format parameter, which is useful for researchers and educators.
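
For example, asking for rendered math output might look like the following (the model and the prompt are illustrative, not taken from the article):

%%ai openai-chat:gpt-3.5-turbo -f math
Derive the closed-form formula for the sum of the first n positive integers.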

As these demonstrations show, Jupyter backed by a large model is a good deal more convenient. If you want to try it, go ahead and give it a spin.

Origin blog.csdn.net/xixiaoyaoww/article/details/132110833