dongsiqie committed · Commit 4bf8527 · 1 Parent(s): cd3c07c

Update README.md

Files changed (1):
  1. README.md +107 -1

README.md CHANGED
@@ -10,4 +10,110 @@ pinned: false
  license: mit
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # Local-Code-Interpreter
+ A local implementation of OpenAI's ChatGPT Code Interpreter.
+
+ ## Introduction
+
+ OpenAI's Code Interpreter plugin for ChatGPT is a revolutionary feature that allows Python code to be executed within a conversation. However, it runs the code in an online sandbox and has certain limitations. This project presents Local Code Interpreter, which executes code on your local device, offering enhanced flexibility, security, and convenience.
+
+ ## Key Advantages
+
+ - **Custom Environment**: Execute code in a customized environment of your choice, ensuring you have the right packages and settings.
+
+ - **Seamless Experience**: Say goodbye to file size restrictions and internet issues while uploading. With Local Code Interpreter, you're in full control.
+
+ - **GPT-3.5 Availability**: While the official Code Interpreter is only available for the GPT-4 model, Local Code Interpreter lets you switch between the GPT-3.5 and GPT-4 models.
+
+ - **Enhanced Data Security**: Keep your data more secure by running code locally, minimizing data transfer over the internet.
+
+ ## Note
+ Executing AI-generated code on your own device without human review is not safe. You are responsible for taking measures to protect the security of your device and data (such as using a virtual machine) before launching this program. All consequences of using this program are borne by yourself.
+
+ ## Usage
+
+ ### Getting Started
+
+ 1. Clone this repository to your local device:
+ ```shell
+ git clone https://github.com/MrGreyfun/Local-Code-Interpreter.git
+ ```
+
+ 2. Install the necessary dependencies. The program has been tested on Windows 10 and CentOS Linux 7.8 with Python 3.9.16. Required packages include:
+ ```text
+ Jupyter Notebook 6.5.4
+ gradio 3.39.0
+ openai 0.27.8
+ ```
+ Other system or package versions may also work.
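
As a sketch, the version pins above can be captured in a `requirements.txt`; note that the PyPI package names (`notebook` for Jupyter Notebook, `gradio`, `openai`) are an assumption inferred from the display names listed:

```text
notebook==6.5.4
gradio==3.39.0
openai==0.27.8
```

These can then be installed in one step with `pip install -r requirements.txt`.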
+
+ ### Configuration
+
+ 1. Create a `config.json` file in the `src` directory, following the examples provided in the `config_example` directory.
+
+ 2. Configure your API key in the `config.json` file.
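
A minimal sketch of what `config.json` might contain, using only the field names this README mentions (`API_KEY`, `model_name`, and, for Azure OpenAI users, `API_VERSION`); the files in `config_example` are the authoritative reference for the actual layout:

```json
{
    "API_KEY": "<YOUR-API-KEY>",
    "model_name": "gpt-3.5-turbo-0613",
    "API_VERSION": "2023-07-01-preview"
}
```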
+
+ Please note:
+ 1. **Set the `model_name` correctly**
+ This program relies on the function calling capability of two specific models:
+ - `gpt-3.5-turbo-0613`
+ - `gpt-4-0613`
+
+ Older versions of these models will not work.
+
+ For Azure OpenAI service users:
+ - Set the `model_name` to your deployment name.
+ - Confirm that the deployed model corresponds to the `0613` version.
+
+ 2. **API version settings**
+ If you're using the Azure OpenAI service, set `API_VERSION` to `2023-07-01-preview` in the `config.json` file. Other API versions do not support the function calls this program needs.
+
+ 3. **Alternate API key handling**
+ If you prefer not to store your API key in the `config.json` file, you can use an alternate approach:
+ - Leave the `API_KEY` field in `config.json` as an empty string:
+ ```json
+ "API_KEY": ""
+ ```
+ - Set the environment variable `OPENAI_API_KEY` to your API key before running the program:
+ - On Windows:
+ ```shell
+ set OPENAI_API_KEY=<YOUR-API-KEY>
+ ```
+ - On Linux:
+ ```shell
+ export OPENAI_API_KEY=<YOUR-API-KEY>
+ ```
+
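The `0613` requirement exists because the program depends on the models' function calling interface. A minimal sketch of that interface with the pinned `openai` 0.27.x client is shown below; the `execute_python` schema is a made-up illustration, not the schema this program actually uses:

```python
import os

# Hypothetical function schema for illustration only; the real program
# defines its own schema for running code.
functions = [
    {
        "name": "execute_python",
        "description": "Run Python code and return its output.",
        "parameters": {
            "type": "object",
            "properties": {
                "code": {"type": "string", "description": "Python source to run"}
            },
            "required": ["code"],
        },
    }
]

# Only call the API when a key is configured (openai==0.27.x call style).
if os.environ.get("OPENAI_API_KEY"):
    import openai

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",  # must be a 0613 model
        messages=[{"role": "user", "content": "Compute 2 + 2 in Python."}],
        functions=functions,
    )
    # A function-calling model can reply with a `function_call` whose arguments
    # contain the code to run; older model versions never emit this field,
    # which is why they will not work with this program.
    print(response["choices"][0]["message"].get("function_call"))
```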
+ ### Launch
+
+ 1. Navigate to the `src` directory.
+
+ 2. Run the command:
+ ```shell
+ python web_ui.py
+ ```
+
+ 3. Open the generated link in your browser to start using the Local Code Interpreter.
+
+ ## Example
+
+ Imagine uploading a data file and requesting the model to perform linear regression and visualize the data. See how Local Code Interpreter provides a seamless experience:
+
+ 1. Upload the data and request linear regression:
+ ![Example 1](example_img/1.jpg)
+
+ 2. Encounter an error in the generated code:
+ ![Example 2](example_img/2.jpg)
+
+ 3. ChatGPT automatically checks the data structure and fixes the bug:
+ ![Example 3](example_img/3.jpg)
+
+ 4. The corrected code runs successfully:
+ ![Example 4](example_img/4.jpg)
+
+ 5. The final result meets your requirements:
+ ![Example 5](example_img/5.jpg)
+ ![Example 6](example_img/6.jpg)