
readme-ai's Introduction

๐ŸŒŠย ๐ŸŒดย ๐Ÿฐย ๐ŸŽ๏ธ๐Ÿ’จย ๐Ÿ›ซย ๐ŸŒŽย ๐ŸŒย ๐ŸŒย ๐Ÿ›ฌ

๐Ÿ“ I enjoy building automated tools and learning new things.

| Open-Source Project | Role |
| --- | --- |
| 🚀 readme-ai | 🧑‍💻 Maintainer |
| 🎈 readme-ai-streamlit | 🧑‍💻 Maintainer |
| 🤖 openai-cookbook | 🤝 Contributor |
GitHub Activity

readme-ai's People

Contributors

dependabot[bot], eli64s, hansipie, zxilly

Stargazers


Watchers


readme-ai's Issues

Feature Request: GitLab Support

Hey everyone,

I've been playing around with README-AI and I'm seriously impressed. The quality of READMEs this tool generates is just awesome, and you can tell the GPT-powered magic does its job well. Props to the team behind this! 🙌

However, I noticed that the tool only supports GitHub repositories at the moment. As a GitLab user (and I'm sure there are others out there), it would be fantastic to have GitLab support built into README-AI. The more, the merrier, right?

I'm guessing that because GitHub and GitLab share a lot in common, it might not be a massive leap to include this feature. Imagine having the power to auto-generate beautiful READMEs for GitLab repositories, or easily transfer your killer READMEs from GitHub to GitLab (and back!). That'd be a game changer.

So, that's my two cents. Anybody else out there think GitLab support would be a cool addition?

Oh, and by the way, this message was written with a little help from a friend - ChatGPT, to be exact. Kinda meta, right?
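For anyone curious what host-agnostic support might look like: a minimal sketch, purely illustrative and not readme-ai's actual code, that parses GitHub and GitLab URLs into the same fields. All names here (`SUPPORTED_HOSTS`, `parse_repo_url`) are assumptions for the sake of the example.

```python
from urllib.parse import urlparse

# Hypothetical host table: adding GitLab support could be as simple as
# registering another entry here.
SUPPORTED_HOSTS = {"github.com": "github", "gitlab.com": "gitlab"}

def parse_repo_url(url: str) -> dict:
    """Parse a repository URL into host, owner, and name fields."""
    parsed = urlparse(url)
    host = SUPPORTED_HOSTS.get(parsed.netloc)
    if host is None:
        raise ValueError(f"Unsupported host: {parsed.netloc}")
    owner, _, name = parsed.path.strip("/").partition("/")
    return {"host": host, "owner": owner, "name": name.removesuffix(".git")}
```

Since both hosts share the `https://<host>/<owner>/<repo>` shape, the same parsing path could serve both.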

Stacktrace: Traceback (most recent call last):

Hi, as the title states:

readmeai -o=README.md --repository=https://github.com/Acetyld/expo-foreground-actions

ERROR Exception: list index out of range

Stacktrace:

Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/readmeai/main.py", line 81, in readme_agent
    dependencies, file_text = parser.get_dependencies(temp_dir)
  File "/opt/homebrew/lib/python3.11/site-packages/readmeai/core/preprocess.py", line 107, in get_dependencies
    dependencies = self.get_dependency_file_contents(contents)
  File "/opt/homebrew/lib/python3.11/site-packages/readmeai/core/preprocess.py", line 64, in get_dependency_file_contents
    parsed_content = parser(content=content["content"])
  File "/opt/homebrew/lib/python3.11/site-packages/readmeai/core/parser.py", line 254, in parse_gradle
    return [
  File "/opt/homebrew/lib/python3.11/site-packages/readmeai/core/parser.py", line 255, in <listcomp>
    match.split(":")[-2].split(".")[-1]
IndexError: list index out of range
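The listcomp in the traceback indexes `match.split(":")[-2]`, which raises IndexError whenever a matched Gradle string contains no `:` separator. A defensive sketch of the fix, guarding on the number of segments; the regex and function shape are illustrative, not the library's actual parser:

```python
import re

def parse_gradle(content: str) -> list[str]:
    # Match dependency declarations like: implementation "group:artifact:version"
    pattern = r"implementation ['\"]([^'\"]+)['\"]"
    dependencies = []
    for match in re.findall(pattern, content):
        parts = match.split(":")
        if len(parts) >= 2:  # expect at least group:artifact; skip anything else
            dependencies.append(parts[-2].split(".")[-1])
    return dependencies
```

Strings without a `:` (plain project references, variables) are simply skipped instead of crashing.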

parser = file_parsers[content["name"]] KeyError: '<filename.ext>'

I correctly generated a README with one of my personal projects, but just for the sake of it, I tried to do the same on one of my work projects and I get this stacktrace:

docker run -it -e OPENAI_API_KEY=<API_KEY> -v $(pwd):/app zeroxeli/readme-ai:latest readmeai -o README.md -r /app
INFO     README-AI is now executing.
INFO     Successfully validated OpenAI API key.
INFO     Model: {'endpoint': 'https://api.openai.com/v1/chat/completions', 'engine': 'gpt-3.5-turbo', 'encoding': 'cl100k_base', 'rate_limit': 5, 'tokens': 669, 'tokens_max': 3899, 'temperature': 1.2, 'api_key': '****************'}
INFO     Repository: GitConfig(repository='/app', name='app')
INFO     Successfully cloned /app to /tmp/tmp2yyavsyd.
INFO     Dependency file found: package.json
INFO     Dependency file found: build.gradle
INFO     Dependency file found: documentiReportRiepilogo.model.ts
Traceback (most recent call last):
  File "/home/tempuser/.local/bin/readmeai", line 8, in <module>
    sys.exit(cli())
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/main.py", line 131, in cli
    asyncio.run(main(api_key, output, repository))
  File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
    return future.result()
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/main.py", line 28, in main
    await generate_readme(llm)
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/main.py", line 35, in generate_readme
    dependencies, file_text = get_dependencies(scanner, repository)
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/main.py", line 89, in get_dependencies
    dependencies, file_text = scanner.get_dependencies(repository)
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/preprocess.py", line 34, in get_dependencies
    dependencies = self.parser.get_dependency_file_contents(contents)
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/preprocess.py", line 90, in get_dependency_file_contents
    parser = file_parsers[content["name"]]
KeyError: 'documentiReportRiepilogo.model.ts'

Just for the sake of it, here is the file:

import type { DocType } from './doc-type.model';

export interface ReportRiepilogoModelTable {
  id: number;
  nomeDocumento: string;
  dataPubblicazione?: string;
  destinatari: number;
  accettato: number;
  rifiutato: number;
  nonRisp: number;
  download: number;
}

export interface ReportRiepilogoModel {
  id: number;
  fileName: string;
  docTypeId: number;
  docType: DocType;
  linkedDocuments: unknown[];
  statoDocumento: string;
  read: boolean;
  proposalCount: number;
  publishDate: string;
  creationDate: string;
  countAccettato: number;
  countDownload: number;
  countNonRisposto: number;
  countRifiutato: number;
}
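The KeyError happens because the file matched a dependency-file heuristic but has no registered parser. A minimal sketch of the usual fix (stand-in names, not readme-ai's actual code): look the parser up with `.get()` and skip files that have no parser instead of indexing the dict directly.

```python
def get_dependency_file_contents(file_parsers, contents):
    """Parse each candidate dependency file, skipping unknown file types."""
    parsed = []
    for content in contents:
        parser = file_parsers.get(content["name"])
        if parser is None:
            continue  # no parser registered for this file; don't crash
        parsed.extend(parser(content["content"]))
    return parsed
```

With this shape, a stray `documentiReportRiepilogo.model.ts` would just be ignored.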

Error when setting up using bash

Processing fstab with mount -a failed.

<3>WSL (8) ERROR: CreateProcessEntryCommon:370: getpwuid(0) failed 2
<3>WSL (8) ERROR: CreateProcessEntryCommon:374: getpwuid(0) failed 2

<3>WSL (8) ERROR: CreateProcessEntryCommon:577: execvpe /bin/bash failed 2
<3>WSL (8) ERROR: CreateProcessEntryCommon:586: Create process not expected to return

And when using Docker to pull, it finishes, but when running the Python command it says:

C:\Users\Ahmed\Desktop\New folder\readme-ai>python src/main.py --api-key "API" --output readme-ai.md --repository https://github.com/AhmedMohammedMostafa/Reverse-Image-Search
Traceback (most recent call last):
  File "C:\Users\Ahmed\Desktop\New folder\readme-ai\src\main.py", line 8, in <module>
    import conf
  File "C:\Users\Ahmed\Desktop\New folder\readme-ai\src\conf.py", line 13, in <module>
    from factory import FileHandler
  File "C:\Users\Ahmed\Desktop\New folder\readme-ai\src\factory.py", line 6, in <module>
    import yaml
ModuleNotFoundError: No module named 'yaml'
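The ModuleNotFoundError means PyYAML isn't installed in the Python environment running `src/main.py`; `pip install pyyaml` (or installing the project's requirements file) should resolve it. A small illustrative helper for checking whether a module is importable before re-running:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# The traceback above implies module_available("yaml") is False in that
# environment; installing PyYAML provides the 'yaml' module.
```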

Problem with Bitbucket private repository

Hello everyone, I'm trying to generate a README file using the Docker option:

docker run -it \
 -e OPENAI_API_KEY=$OPENAI_API_KEY \
 -v "$(pwd)":/app zeroxeli/readme-ai:latest \
readmeai -o readme-ai.md -r https://user:[email protected]/devs/ms-test 

However, I'm encountering the following error:


Traceback (most recent call last):
  File "/home/tempuser/.local/bin/readmeai", line 8, in <module>
    sys.exit(commands())
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/tempuser/.local/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/cli/commands.py", line 35, in commands
    main(
  File "/home/tempuser/.local/lib/python3.9/site-packages/readmeai/main.py", line 38, in main
    conf.git = GitConfig(repository=repository)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 3 validation errors for GitConfig
repository
  Invalid repository URL or path: https://user:[email protected]/devs/ms-test  (type=value_error)
source
  expected str, bytes or os.PathLike object, not NoneType (type=type_error)
name
  expected str, bytes or os.PathLike object, not NoneType (type=type_error)

If I use a public GitHub repository, I don't have any problem.

My question is, can I use a private repository, or is there something wrong with the parameters I'm passing? Any ideas about what might be causing this error?
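One hedged observation: the URL in the validation error ends with a trailing space (carried over from the shell command), which alone can make a strict validator reject it. A sketch of a validator, illustrative and not readme-ai's actual code, that strips whitespace and compares only the hostname, so embedded `user:token@` credentials don't break the check:

```python
from urllib.parse import urlparse

# Hypothetical allow-list of known hosts for this sketch.
KNOWN_HOSTS = ("github.com", "gitlab.com", "bitbucket.org")

def is_valid_repo_url(url: str) -> bool:
    """Accept http(s) repo URLs on known hosts, tolerating credentials."""
    parsed = urlparse(url.strip())  # drop stray leading/trailing whitespace
    # urlparse exposes the hostname separately from any user:password part,
    # so credentialed URLs compare cleanly.
    return parsed.scheme in ("http", "https") and parsed.hostname in KNOWN_HOSTS
```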

HTTPStatus Exception: Client error '400 Bad Request'

Getting that error after processing all file prompts; here is a snippet:

INFO
Processing prompt: src\utils\helpers.js
Response: ...

ERROR HTTPStatus Exception:
Client error '400 Bad Request' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://httpstatuses.com/400

ERROR HTTPStatus Exception:
Client error '400 Bad Request' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://httpstatuses.com/400

I'm not used to Python, so tell me if you need more info about the error. Awesome work, by the way.
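A 400 from the chat/completions endpoint often means the prompt plus the requested completion exceeds the model's context window. A rough illustration of the usual mitigation, truncating each file's text to a token budget; a real implementation would count tokens with tiktoken, while this sketch just approximates ~4 characters per token to show the idea:

```python
def truncate_to_budget(text: str, max_tokens: int, chars_per_token: int = 4) -> str:
    """Trim text so its approximate token count stays within max_tokens."""
    budget = max_tokens * chars_per_token
    return text if len(text) <= budget else text[:budget]
```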

(screenshot: errorReadmeai)

In the generated README.md:

(screenshot: readmeError)

ERROR HTTPStatus Exception

When using the provided example command:

docker run -it \
  -e OPENAI_API_KEY=xxx \
  -v "$(pwd)":/app zeroxeli/readme-ai:latest \
  -r https://github.com/eli64s/readme-ai

I keep getting this error:

ERROR HTTPStatus Exception: Client error '404 Not Found' for url 'https://api.openai.com/v1/chat/completions' For more information check: https://httpstatuses.com/404

Tried with both free and paid API keys, as well as the pip install and the Docker one. All result in the same error.

How to process private repositories or local codebase

Hi!

This project seems really cool and I'd love to try it on some of my repos.
However, those repos are private and can only be cloned via SSH (git clone [email protected]:<owner>/<repo>)

From the documentation, I was under the impression it is possible to provide the local path to the codebase in order to generate the README:

By providing a remote repository URL or path to your codebase [...]
[...]
-r or --repository: The URL or path to your code repository.

My issue is that the command doesn't seem to accept either git repositories in the [email protected]:<OWNER>/<REPO> format or a local path to a git repository such as ~/projects/my-awesome-project.

When using a local path to the codebase:

readmeai -k '<TOKEN>' -r ~/projects/my-awesome-project -o /tmp/readmai.md
[...]
File "/home/shampu/.local/lib/python3.11/site-packages/readmeai/main.py", line 23, in main
    config.git = conf.GitConfig(repository=repository)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "pydantic/dataclasses.py", line 286, in pydantic.dataclasses._add_pydantic_validation_attributes.handle_extra_init

  File "<string>", line 5, in __init__
  File "pydantic/dataclasses.py", line 305, in pydantic.dataclasses._add_pydantic_validation_attributes.new_post_init
    return ('Field('
  File "/home/shampu/.local/lib/python3.11/site-packages/readmeai/conf.py", line 79, in __post_init__
    raise ValueError(f"Ivalid repository URL or path: {self.repository}")
ValueError: Ivalid repository URL or path: /home/shampu/projects/my-awesome-project

When providing the path to a private repo:

readmeai -k '<TOKEN>' -r '[email protected]:Shampu/MyAwesomeProject.git' -o /tmp/readmai.md
[...]
File "/home/shampu/.local/lib/python3.11/site-packages/readmeai/main.py", line 23, in main
    config.git = conf.GitConfig(repository=repository)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "pydantic/dataclasses.py", line 286, in pydantic.dataclasses._add_pydantic_validation_attributes.handle_extra_init

  File "<string>", line 5, in __init__
  File "pydantic/dataclasses.py", line 305, in pydantic.dataclasses._add_pydantic_validation_attributes.new_post_init
    return ('Field('
  File "/home/shampu/.local/lib/python3.11/site-packages/readmeai/conf.py", line 79, in __post_init__
    raise ValueError(f"Ivalid repository URL or path: {self.repository}")
ValueError: Ivalid repository URL or path: [email protected]:Shampu/MyAwesomeProject.git

I'd love for anyone to enlighten me regarding the use of SSH cloning and a local codebase with this cool project.
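For what it's worth, a sketch of a validator that would accept both forms rejected above: local paths (with `~` expansion) and scp-style SSH remotes. The function name and regex are assumptions for illustration, not readme-ai's actual code:

```python
import re
from pathlib import Path

# scp-style SSH remote, e.g. git@<host>:<owner>/<repo>.git
SSH_PATTERN = re.compile(r"^git@[\w.-]+:[\w./-]+(\.git)?$")

def resolve_repository(repository: str) -> str:
    """Accept an existing local directory or an scp-style SSH remote."""
    path = Path(repository).expanduser()  # handles ~/projects/...
    if path.is_dir():
        return str(path.resolve())
    if SSH_PATTERN.match(repository):
        return repository
    raise ValueError(f"Invalid repository URL or path: {repository}")
```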

Error generating readme: [WinError 5] Access is denied

OS: Windows 10
I have tried running the shell as administrator and it still gives the exact same error. Is there anything else I could try?

PS D:\dev\ti\projects\PROJECTNAME> readmeai -r . --api GEMINI -o ai.md -m gemini-pro
► INFO | 2024-03-21 10:50:40 | readmeai.config.settings | Loaded configuration file: settings/blacklist.toml
► INFO | 2024-03-21 10:50:40 | readmeai.config.settings | Loaded configuration file: settings/commands.toml
► INFO | 2024-03-21 10:50:40 | readmeai.config.settings | Loaded configuration file: settings/languages.toml
► INFO | 2024-03-21 10:50:40 | readmeai.config.settings | Loaded configuration file: settings/markdown.toml
► INFO | 2024-03-21 10:50:40 | readmeai.config.settings | Loaded configuration file: settings/parsers.toml
► INFO | 2024-03-21 10:50:40 | readmeai.config.settings | Loaded configuration file: settings/prompts.toml
► INFO | 2024-03-21 10:50:40 | readmeai.core.utils | GEMINI settings FOUND in environment!
► INFO | 2024-03-21 10:50:40 | readmeai._agent | Repository validated: repository='.' full_name='' host_domain='local' host='local' name='.'
► INFO | 2024-03-21 10:50:40 | readmeai._agent | LLM API settings: api='GEMINI' base_url='https://api.openai.com/v1/chat/completions' context_window=3999 encoder='cl100k_base' model='gemini-pro' temperature=0.9 tokens=650 top_p=0.9
Traceback (most recent call last):
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\readmeai\_agent.py", line 77, in readme_agent
    asyncio.run(readme_generator(conf, output_file))
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\asyncio\runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\asyncio\base_events.py", line 684, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\readmeai\_agent.py", line 86, in readme_generator
    await clone_repository(conf.config.git.repository, temp_dir)
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\readmeai\services\git.py", line 83, in clone_repository
    await remove_hidden_contents(temp_dir_path)
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\readmeai\services\git.py", line 100, in remove_hidden_contents
    shutil.rmtree(item)
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\shutil.py", line 808, in rmtree
    return _rmtree_unsafe(path, onexc)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\shutil.py", line 631, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onexc)
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\shutil.py", line 631, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onexc)
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\shutil.py", line 636, in _rmtree_unsafe
    onexc(os.unlink, fullname, err)
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\shutil.py", line 634, in _rmtree_unsafe
    os.unlink(fullname)
PermissionError: [WinError 5] Access is denied: 'D:\\Users\\USERNAME\\AppData\\Local\\Temp\\tmp760n5qjo\\.git\\objects\\pack\\pack-86f33272c58b623035530fab3cbf693de6534b94.idx'  

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Scripts\readmeai.exe\__main__.py", line 7, in <module>
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\readmeai\cli\main.py", line 51, in main
    readme_agent(
  File "D:\Users\USERNAME\AppData\Local\Programs\Python\Python312\Lib\site-packages\readmeai\_agent.py", line 80, in readme_agent
    raise ReadmeGeneratorError(exc, traceback.format_exc()) from exc
readmeai._exceptions.ReadmeGeneratorError: ("Error generating readme: [WinError 5] Access is denied: 'D:\\\\Users\\\\USERNAME\\\\AppData\\\\Local\\\\Temp\\\\tmp760n5qjo\\\\.git\\\\objects\\\\pack\\\\pack-86f33272c58b623035530fab3cbf693de6534b94.idx'", 'Traceback (most recent call last):\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\site-packages\\readmeai\\_agent.py", line 77, in readme_agent\n    asyncio.run(readme_generator(conf, output_file))\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\asyncio\\runners.py", line 194, in run\n    return runner.run(main)\n           ^^^^^^^^^^^^^^^^\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\asyncio\\runners.py", line 118, in run\n    return self._loop.run_until_complete(task)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\asyncio\\base_events.py", line 684, in run_until_complete\n    return future.result()\n           ^^^^^^^^^^^^^^^\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\site-packages\\readmeai\\_agent.py", line 86, in readme_generator\n    await clone_repository(conf.config.git.repository, temp_dir)\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\site-packages\\readmeai\\services\\git.py", line 83, in clone_repository\n    await remove_hidden_contents(temp_dir_path)\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\site-packages\\readmeai\\services\\git.py", line 100, in remove_hidden_contents\n    shutil.rmtree(item)\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\shutil.py", line 808, in rmtree\n    return _rmtree_unsafe(path, onexc)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\shutil.py", line 631, in _rmtree_unsafe\n    
_rmtree_unsafe(fullname, onexc)\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\shutil.py", line 631, in _rmtree_unsafe\n    _rmtree_unsafe(fullname, onexc)\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\shutil.py", line 636, in _rmtree_unsafe\n    onexc(os.unlink, fullname, err)\n  File "D:\\Users\\USERNAME\\AppData\\Local\\Programs\\Python\\Python312\\Lib\\shutil.py", line 634, in _rmtree_unsafe\n    os.unlink(fullname)\nPermissionError: [WinError 5] Access is denied: \'D:\\\\Users\\\\USERNAME\\\\AppData\\\\Local\\\\Temp\\\\tmp760n5qjo\\\\.git\\\\objects\\\\pack\\\\pack-86f33272c58b623035530fab3cbf693de6534b94.idx\'\n')
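Git marks its pack files read-only, and on Windows `os.unlink` refuses to delete read-only files, which is exactly the PermissionError above. The standard workaround is an error handler that clears the read-only bit and retries (a sketch, not readme-ai's actual handler):

```python
import os
import shutil
import stat

def _force_remove(func, path, exc_info):
    """shutil.rmtree error handler: clear the read-only bit and retry."""
    os.chmod(path, stat.S_IWRITE)
    func(path)

def remove_tree(path: str) -> None:
    """Delete a directory tree, coping with read-only .git pack files."""
    shutil.rmtree(path, onerror=_force_remove)
```

Note that Python 3.12 deprecates `onerror` in favor of `onexc`, which receives the exception object instead of an `exc_info` tuple, so a version check may be needed.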

Unable to generate the readme file

Whenever I try using this via Docker, it appears to work, but the file is never generated. It only creates a folder with a broken name and nothing in it.

This is the output of the CLI:

INFO     Total files: 14
WARNING  Ignoring file: package-lock.json
WARNING  Ignoring file: .gitignore
WARNING  Ignoring file: package.json
WARNING  Ignoring file: .eslintrc.json
INFO
Processing prompt: src/routes/getAllPrompts.ts
Response: This code defines a route to get all prompts using the Fastify framework. It retrieves a list of prompts from the Prisma ORM and returns it as a response.
INFO
Processing prompt: src/server.ts
Response: This code sets up a HTTP server with Fastify framework and enables CORS. It also registers multiple routes for handling prompt retrieval, video upload, transcription creation, and AI completion generation. The server listens on port 3333.
INFO
Processing prompt: src/routes/uploadVideo.ts
Response: This code provides an endpoint for uploading MP3 video files. It uses Fastify and @fastify/multipart for file handling, prisma for database operations, and node modules for file manipulation and UUID generation. The code validates the file type and size, saves the file to a temporary directory, and creates a corresponding entry in the database. Maximum file size limit is set to 500MB.
INFO
Processing prompt: src/routes/createTranscription.ts
Response: This code defines a route for creating transcriptions of videos in a Fastify server. It uses Zod for validation, Prisma for database access, and OpenAI's model to generate transcriptions from audio files. Transcriptions are stored in the database and returned as a response. Total characters: 323.
INFO
Processing prompt: routes.http
Response: The code provides four main functionalities: 1.'get-prompts' retrieves a list of prompts from a local server.2.'upload' allows users to upload a video file to the server.3.'create-transcription' generates a transcription for a specific video by providing a prompt.4.'generate-ai-completion' uses artificial intelligence to generate a concise summary of the video's transcription based on the given prompt.
INFO
Processing prompt: src/lib/openai.ts
Response: This code initializes and sets up the OpenAI client by importing the necessary packages and creating a new instance of the client using an API key provided through the environment variable.
INFO
Processing prompt: src/routes/generateAICompletion.ts
Response: This code defines an API route for generating AI completions. It validates the input, retrieves a video from a database, generates a prompt message, sends it to OpenAI's chat completions API, and streams the response to the client.
INFO
Processing prompt: src/lib/prisma.ts
Response: The code imports and initializes the Prisma client, which communicates with the database. It offers functions to interact with database tables, perform CRUD operations, and manage the data.
INFO
Processing prompt: prisma/seed.ts
Response: This code utilizes the Prisma ORM to create and manage prompts for YouTube videos. It deletes existing prompts and creates new ones with defined templates. Users can generate catchy video titles and concise descriptions with hashtags based on video transcriptions.
INFO
Processing prompt: prisma/schema.prisma
Response: The code sets up a generator client and sets the Prisma client as the provider. It also sets up a SQLite datasource using the specified URL. Two models are defined: Video with various fields and Prompt with title and template fields.
INFO
Processing prompt: 1
Response: Empowering AI with NLW precision!
INFO
Processing prompt: 2
Response: This project provides a server that enables users to upload and transcribe video files using artificial intelligence. It offers four main functionalities: retrieving prompts from a local server, uploading video files to the server, creating transcriptions based on the provided prompt, and generating concise summaries of video transcriptions using AI completion. The project's value proposition lies in automating the process of transcribing videos, saving time and effort for users while also providing comprehensive and accurate transcriptions.
INFO
Processing prompt: 3
Response:

| Feature | Description |
| ---------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **โš™๏ธ Architecture**     | The system follows a server-client architecture, with a HTTP server implemented using the Fastify framework. The modular setup enables easy exteensibility and separation of concerns.                                   |
| **๐Ÿ“– Documentation**    | The codebase lacks detailed documentation. Some files have short comments, but overall, there could be a better explanation of function usage, file structure, and external dependencies.                              |
| **๐Ÿ”— Dependencies**     | The system relies on Fastify, Prisma, Zod, and OpenAI packages. Fastify is used to handle HTTP requests, Prisma provides database access, Zod adds input validation, and OpenAI powers the AI completion feature.           |
| **๐Ÿงฉ Modularity**       | Modularity is achieved by separating different functionalities into separate route files. Each route handles a specific part of the application and is enabled through the main server file.                                |
| **โœ”๏ธ Testing**          | The codebase does not include any testing strategies or tools. Adding unit tests, integration tests, and end-to-end tests would greatly enhance  code reliability and ensure the proper functioning of the system.         |
| **โšก๏ธ Performance**      | The system's performance can be improved by implementing request/response caching, compressing data during transmission, and optimizing databasse queries. Proper code profiling and benchmarking would provide more insights. |
| **๐Ÿ” Security**         | The system needs to enhance security measures. Validating user inputs more rigorously, safeguarding API keys/secrets, implementing secure data storage practices, and using HTTPS would improve the overall security posture.     |
| **๐Ÿ”€ Version Control**  | Git is used as the version control system for this project. The codebase takes advantage of Git's branch management, commit history, and code collaboration features to facilitate development and maintain code quality.        |
| **๐Ÿ”Œ Integrations**     | The system integrates with the OpenAI API to power the AI completion feature. The integration is well handled with dedicated code in the `src/lib/openai.ts` file, abstracting API interactions from the rest of the codebase.     |
| **๐Ÿ“ถ Scalability**      | The system's scalability could be improved by exploring load balancing techniques, utilizing queues to offload tasks, and implementing caching mechanisms. Proper horizontal scaling and resource optimization strategies are crucial.   |
INFO     Successfully cloned https://github.com/mattpsvreis/nlw-ia-server to /tmp/tmp5j0vr2dq/repo.
WARNING  Exception creating repository tree structure: Error: file permissions of cloned repository must be set to 0o700.
INFO     Top language: Typescript (.ts)
INFO     TypeScript setup guide: ['npm install', 'npm run build && node dist/main.js', 'npm test']
INFO     README file generated at: /app/readme-ai.md
INFO     README-AI execution complete.

This is how it looks in the folders:

(screenshot of the resulting folders)

I don't know what I'm doing wrong.

TikToken Special Character Conflict

When running it on a basic JS application, I'm getting this error:

ERROR    [1:logger] [2024-01-29 15:51:28,449] Error in token encoding: Encountered text corresponding to disallowed special token '<|endoftext|>'.
If you want this text to be encoded as a special token, pass it to `allowed_special`, e.g. `allowed_special={'<|endoftext|>', ...}`.
If you want this text to be encoded as normal text, disable the check for this token by passing `disallowed_special=(enc.special_tokens_set - {'<|endoftext|>'})`.
To disable this check for all special tokens, pass `disallowed_special=()`.

Related to: langchain-ai/langchain#923

Error when using GEMINI

Not sure if this is because of GEMINI, or Windows, or something else. Has anyone had this issue before?

(readmeai) C:\Users\BrahianVT\Desktop\python\readme-ai>python -m readmeai.cli.main --api GEMINI -r https://github.com/eli64s/readme-ai
► INFO | 2024-03-15 14:21:09 | readmeai.config.settings | Loaded configuration file: settings/blacklist.toml
► INFO | 2024-03-15 14:21:09 | readmeai.config.settings | Loaded configuration file: settings/commands.toml
► INFO | 2024-03-15 14:21:09 | readmeai.config.settings | Loaded configuration file: settings/languages.toml
► INFO | 2024-03-15 14:21:09 | readmeai.config.settings | Loaded configuration file: settings/markdown.toml
► INFO | 2024-03-15 14:21:09 | readmeai.config.settings | Loaded configuration file: settings/parsers.toml
► INFO | 2024-03-15 14:21:09 | readmeai.config.settings | Loaded configuration file: settings/prompts.toml
► INFO | 2024-03-15 14:21:09 | readmeai.core.utils | GEMINI settings FOUND in environment!
► INFO | 2024-03-15 14:21:09 | readmeai._agent | Repository validated: repository='https://github.com/eli64s/readme-ai' full_name='eli64s/readme-ai' host_domain='github.com' host='github' name='readme-ai'
► INFO | 2024-03-15 14:21:09 | readmeai._agent | LLM API settings: api='GEMINI' base_url='https://api.openai.com/v1/chat/completions' context_window=3999 encoder='cl100k_base' model='gpt-3.5-turbo' temperature=0.9 tokens=650 top_p=0.9
C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\shutil.py:648: RuntimeWarning: coroutine 'handleRemoveReadonly' was never awaited
  onexc(os.unlink, fullname, err)
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\shutil.py:652: RuntimeWarning: coroutine 'handleRemoveReadonly' was never awaited
  onexc(os.rmdir, path, err)
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
► INFO | 2024-03-15 14:21:14 | readmeai.core.preprocess | Dependency file found: Dockerfile:
[('python', '3.10-slim-buster')]
► INFO | 2024-03-15 14:21:14 | readmeai.core.preprocess | Dependency file found: pyproject.toml:
['python', 'aiohttp', 'click', 'gitpython', 'google-generativeai', 'openai', 'pydantic', 'pyyaml', 'tenacity', 'tiktoken', 'toml', 'ruff', 'pytest', 'pytest-asyncio', 'pytest-cov', 'pytest-randomly', 'pytest-sugar', 'pytest-xdist', 'mkdocs', 'mkdocs-material']
► INFO | 2024-03-15 14:21:14 | readmeai.core.preprocess | Dependency file found: environment.yaml:
['python', 'pip', '-r requirements.txt']
► INFO | 2024-03-15 14:21:14 | readmeai.core.preprocess | Dependency file found: requirements.txt:
['aiohttp', 'aiosignal', 'anyio', 'async-timeout', 'attrs', 'cachetools', 'certifi', 'charset-normalizer', 'click', 'colorama', 'distro', 'exceptiongroup', 'frozenlist', 'gitdb', 'gitpython', 'google-ai-generativelanguage', 'google-api-core', 'google-api-core', 'google-auth', 'google-generativeai', 'googleapis-common-protos', 'grpcio-status', 'grpcio', 'h11', 'httpcore', 'httpx', 'idna', 'multidict', 'openai', 'proto-plus', 'protobuf', 'pyasn1-modules', 'pyasn1', 'pydantic', 'pyyaml', 'regex', 'requests', 'rsa', 'smmap', 'sniffio', 'tenacity', 'tiktoken', 'toml', 'tqdm', 'typing-extensions', 'urllib3', 'yarl']
► INFO | 2024-03-15 14:21:14 | readmeai.core.preprocess | Dependencies: {'Dockerfile': [('python', '3.10-slim-buster')], 'Makefile': [], 'poetry.lock': [], 'pyproject.toml': ['python', 'aiohttp', 'click', 'gitpython', 'google-generativeai', 'openai', 'pydantic', 'pyyaml', 'tenacity', 'tiktoken', 'toml', 'ruff', 'pytest', 'pytest-asyncio', 'pytest-cov', 'pytest-randomly', 'pytest-sugar', 'pytest-xdist', 'mkdocs', 'mkdocs-material'], 'environment.yaml': ['python', 'pip', '-r requirements.txt'], 'requirements.txt': ['aiohttp', 'aiosignal', 'anyio', 'async-timeout', 'attrs', 'cachetools', 'certifi', 'charset-normalizer', 'click', 'colorama', 'distro', 'exceptiongroup', 'frozenlist', 'gitdb', 'gitpython', 'google-ai-generativelanguage', 'google-api-core', 'google-api-core', 'google-auth', 'google-generativeai', 'googleapis-common-protos', 'grpcio-status', 'grpcio', 'h11', 'httpcore', 'httpx', 'idna', 'multidict', 'openai', 'proto-plus', 'protobuf', 'pyasn1-modules', 'pyasn1', 'pydantic', 'pyyaml', 'regex', 'requests', 'rsa', 'smmap', 'sniffio', 'tenacity', 'tiktoken', 'toml', 'tqdm', 'typing-extensions', 'urllib3', 'yarl']}
► INFO | 2024-03-15 14:21:14 | readmeai._agent | Total files analyzed: 80
► INFO | 2024-03-15 14:21:14 | readmeai._agent | Dependencies found: ['', 'lock', 'google-auth', 'tenacity', 'smmap', 'yaml', 'pip', 'gitdb', 'multidict', 'toml', 'Makefile', 'exceptiongroup', 'colorama', 'tiktoken', 'Dockerfile', 'google-api-core', 'anyio', 'charset-normalizer', 'idna', 'urllib3', 'async-timeout', 'pyyaml', 'certifi', 'environment.yaml', 'python', 'httpcore', 'mkdocs-material', 'sniffio', 'googleapis-common-protos', 'cachetools', 'openai', 'h11', 'protobuf', 'google-ai-generativelanguage', 'proto-plus', 'pytest', 'gitpython', 'shell', 'distro', 'poetry.lock', 'ruff', 'grpcio-status', 'pyasn1', 'pyproject.toml', 'txt', 'httpx', 'requests', 'frozenlist', 'tqdm', 'regex', '-r requirements.txt', 'typing-extensions', 'mkdocs', 'rsa', 'py', 'google-generativeai', 'pyasn1-modules', 'requirements.txt', 'pytest-xdist', 'pytest-asyncio', 'pydantic', ('python', '3.10-slim-buster'), 'sh', 'aiosignal', 'text', 'pytest-randomly', 'yarl', 'yml', 'pytest-sugar', 'grpcio', 'attrs', 'click', 'pytest-cov', 'aiohttp']
404 models/gpt-3.5-turbo is not found for API version v1beta, or is not supported for GenerateContent. Call ListModels to see the list of available models and their supported methods.
Traceback (most recent call last):
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\_agent.py", line 77, in readme_agent
    asyncio.run(readme_generator(conf, output_file))
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\asyncio\runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\asyncio\base_events.py", line 685, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\_agent.py", line 122, in readme_generator
    ).build()
      ^^^^^^^
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\generators\builder.py", line 121, in build
    self.md_summaries,
    ^^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\generators\builder.py", line 65, in md_summaries
    summaries = tables.format_code_summaries(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\generators\tables.py", line 59, in format_code_summaries
    for summary in code_summaries:
TypeError: 'NotFound' object is not iterable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\cli\main.py", line 73, in <module>
    main()
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\miniconda3\envs\readmeai\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\cli\main.py", line 51, in main
    readme_agent(
  File "C:\Users\BrahianVT\Desktop\python\readme-ai\readmeai\_agent.py", line 80, in readme_agent
    raise ReadmeGeneratorError(exc, traceback.format_exc()) from exc
readmeai._exceptions.ReadmeGeneratorError: ("Error generating readme: 'NotFound' object is not iterable", 'Traceback (most recent call last):\n  File "C:\\Users\\BrahianVT\\Desktop\\python\\readme-ai\\readmeai\\_agent.py", line 77, in readme_agent\n    asyncio.run(readme_generator(conf, output_file))\n  File "C:\\Users\\BrahianVT\\miniconda3\\envs\\readmeai\\Lib\\asyncio\\runners.py", line 194, in run\n    return runner.run(main)\n           ^^^^^^^^^^^^^^^^\n  File "C:\\Users\\BrahianVT\\miniconda3\\envs\\readmeai\\Lib\\asyncio\\runners.py", line 118, in run\n    return self._loop.run_until_complete(task)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "C:\\Users\\BrahianVT\\miniconda3\\envs\\readmeai\\Lib\\asyncio\\base_events.py", line 685, in run_until_complete\n    return future.result()\n           ^^^^^^^^^^^^^^^\n  File "C:\\Users\\BrahianVT\\Desktop\\python\\readme-ai\\readmeai\\_agent.py", line 122, in readme_generator\n    ).build()\n      ^^^^^^^\n  File "C:\\Users\\BrahianVT\\Desktop\\python\\readme-ai\\readmeai\\generators\\builder.py", line 121, in build\n    self.md_summaries,\n    ^^^^^^^^^^^^^^^^^\n  File "C:\\Users\\BrahianVT\\Desktop\\python\\readme-ai\\readmeai\\generators\\builder.py", line 65, in md_summaries\n    summaries = tables.format_code_summaries(\n                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "C:\\Users\\BrahianVT\\Desktop\\python\\readme-ai\\readmeai\\generators\\tables.py", line 59, in format_code_summaries\n    for summary in code_summaries:\nTypeError: \'NotFound\' object is not iterable\n')
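The final TypeError is telling: `format_code_summaries` iterates over what is actually a Gemini `NotFound` exception object (the 404 above) rather than a list of summaries. A minimal defensive sketch, with hypothetical names mirroring the traceback rather than the project's actual code:

```python
from typing import Any


def format_code_summaries(placeholder: str, code_summaries: Any) -> list:
    """Only iterate when the upstream LLM call actually returned summaries."""
    if isinstance(code_summaries, BaseException):
        # e.g. a google.api_core NotFound error leaked through instead of data
        raise RuntimeError(f"LLM API call failed: {code_summaries!r}")
    if not isinstance(code_summaries, (list, tuple)):
        raise TypeError(
            f"Expected a list of summaries, got {type(code_summaries).__name__}"
        )
    # Fall back to a placeholder for any empty summary
    return [(module, summary or placeholder) for module, summary in code_summaries]


rows = format_code_summaries(
    "N/A", [("cli/main.py", "Entry point"), ("core/utils.py", None)]
)
```

Failing fast with the real API error would make it obvious that the Gemini backend rejected the OpenAI-style model name shown in the log (`model='gpt-3.5-turbo'`).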

Running README-AI locally without GitHub repository URL

Description

I'm trying to run README-AI locally on my machine without a GitHub repository, but I'm facing an issue with the configuration file that requires me to provide a GitHub repository URL. I would like to use a Git repository on my machine instead.

Steps to reproduce

  1. Clone the README-AI repository on your machine
  2. Navigate to the cloned repository directory
  3. Run the README-AI command with the --local flag to specify the local repository path, for example:
readme-ai --local /path/to/local/repo

Expected behavior

README-AI should generate a README.md file based on the local repository and without requiring a GitHub repository URL.

Actual behavior

README-AI displays an error message indicating that the GitHub repository URL is missing in the configuration file.
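A sketch of how such a `--local` option could distinguish a local path from a remote URL (the function name and behavior are assumptions for illustration, not the tool's actual API):

```python
import os
from urllib.parse import urlparse


def resolve_repo_source(repo: str) -> str:
    """Classify a repository argument as a local directory or a remote URL."""
    if os.path.isdir(repo):
        return "local"
    parsed = urlparse(repo)
    if parsed.scheme in ("http", "https") and parsed.netloc:
        return "remote"
    raise ValueError(f"Not a local directory or a valid repository URL: {repo}")
```

With a branch like this, the config layer could skip URL validation entirely for local paths.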

Readme for repos with lots of data

Hi! Before running the script, I want to make sure that it isn't going to send non-code files to GPT.

In my ./SRC for the intended repo, I have .py, .csv, and .png files. I also have lots of data (17k+) in JSON files in ./dat. Will the script send the JSON file contents to GPT for processing? How do I make sure the JSON files are excluded?
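Pending an official answer, one way to preview locally which files would be picked up is a small stdlib filter (the exclusion set here is illustrative, not readme-ai's actual ignore list):

```python
from pathlib import Path

# Hypothetical pre-flight check: list files that would be summarized,
# skipping data/binary extensions such as .json, .csv, and .png.
EXCLUDED_SUFFIXES = {".json", ".csv", ".png"}


def files_to_summarize(repo_root: str) -> list:
    return sorted(
        str(p.relative_to(repo_root))
        for p in Path(repo_root).rglob("*")
        if p.is_file() and p.suffix.lower() not in EXCLUDED_SUFFIXES
    )
```

Running this over ./SRC and ./dat would show exactly what survives the filter before any tokens are spent.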

package manager

Would you be willing to accept a new package manager? I found it much easier to set up with pipenv or poetry, in half the lines of code. If so, I'll switch the package manager for you. Love the project; just waiting for ChatGPT to accept more tokens, as 4,000 is rather small.

requirements.txt has to specify versions

The application's requirements.txt does not pin dependency versions. Please update this. I'm currently unable to get the application to run with a paid API key, perhaps because of the missing version pinning.
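For reference, pinning means giving every entry an exact version rather than an open range, e.g. (version numbers below are illustrative, not a tested set):

```
# requirements.txt with pinned versions
openai==0.27.8
tiktoken==0.4.0
click==8.1.7
```

A known-good set can be captured from a working environment with `pip freeze > requirements.txt`.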

Authentication failed to github.com

Running the full command:
python src/main.py -k KEY -o docs/README_X.md -u GITHUB_URL, and receiving the following error after being prompted for a username/password:

remote: Support for password authentication was removed on August 13, 2021.
remote: Please see https://docs.github.com/en/get-started/getting-started-with-git/about-remote-repositories#cloning-with-https-urls for information on currently recommended modes of authentication.
fatal: Authentication failed for 'GITHUB_URL'

KEY and GITHUB_URL were, of course, fully written out.
Thanks for what looks like an awesome project!
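The remote's message means HTTPS password prompts no longer work on GitHub; cloning a private repo over HTTPS now needs a personal access token embedded in the URL (the token value below is a placeholder):

```shell
# Build an authenticated HTTPS clone URL with a personal access token (PAT).
# Replace the placeholder token and OWNER/REPO with real values.
GITHUB_TOKEN="ghp_yourTokenHere"
CLONE_URL="https://${GITHUB_TOKEN}@github.com/OWNER/REPO.git"
echo "$CLONE_URL"   # pass this URL to git clone or to the -u flag
```

Public repositories clone without any credentials, so this only matters for private ones.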

ModuleNotFoundError: No module named 'openai'

(base) root@template:~/readme-ai# conda activate readmeai
(readmeai) root@template:~/readme-ai# export OPENAI_API_KEY="sk-blablablabla"
(readmeai) root@template:~/readme-ai# python src/main.py -o readme-ai.md -r https://github.com/johnfelipe/revisionmanualbasadaensonarconreportingtool
Traceback (most recent call last):
  File "/root/readme-ai/src/main.py", line 8, in <module>
    import conf
  File "/root/readme-ai/src/conf.py", line 10, in <module>
    import openai
ModuleNotFoundError: No module named 'openai'
(readmeai) root@template:~/readme-ai# ls
CHANGELOG.md        Dockerfile  poetry.lock       scripts   tests
CODE_OF_CONDUCT.md  examples    pyproject.toml    setup
conf                LICENSE     README.md         setup.py
CONTRIBUTING.md     Makefile    requirements.txt  src
(readmeai) root@template:~/readme-ai# python src/main.py -o readme-ai.md -r https://github.com/eli64s/readme-ai
Traceback (most recent call last):
  File "/root/readme-ai/src/main.py", line 8, in <module>
    import conf
  File "/root/readme-ai/src/conf.py", line 10, in <module>
    import openai
ModuleNotFoundError: No module named 'openai'
(readmeai) root@template:~/readme-ai# python src/main.py --api-key "sk-blablablabla" --output readme-ai.md --repository https://github.com/eli64s/readme-ai
Traceback (most recent call last):
  File "/root/readme-ai/src/main.py", line 8, in <module>
    import conf
  File "/root/readme-ai/src/conf.py", line 10, in <module>
    import openai
ModuleNotFoundError: No module named 'openai'

How can I fix this?
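A ModuleNotFoundError for a package you believe is installed usually means pip targeted a different interpreter than the one running the script. A quick stdlib check shows which interpreter is active; installing with `python -m pip install openai` then guarantees the target matches:

```python
import sys

# The interpreter actually executing this script; packages must be
# installed into this exact interpreter for imports to succeed.
print(sys.executable)
```

If the printed path is not inside the readmeai conda environment, the environment's activation (or PATH) is the real problem.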

UnboundLocalError: cannot access local variable 'temp_dir' where it is not associated with a value

Hi,
I'm trying to run readme-ai on both GitHub and GitLab repos, without any success.

I am using Python 3.11.4 with conda as the virtual environment (I also tried other venvs).
I installed with pip install readmeai.

this is the error I get:

INFO     README-AI is now executing.                                                                                                                                                    
INFO     Repository: https://gitlab.com/corractions/datalake.git
INFO     Model:  gpt-3.5-turbo
INFO     Output: readme-ai.md
ERROR    Exception: Error cloning git repository: [WinError 5] Access is denied: 'C:\\Users\\yuval\\AppData\\Local\\Temp\\tmpvmjhqr25\\.git\\objects\\pack\\pack-48b911d1bf9b980873e1d4b0cc2e68b52e19f74b.idx'
ERROR    Stacktrace: Traceback (most recent call last):
  File "C:\Users\yuval\anaconda3\Lib\site-packages\readmeai\utils\github.py", line 32, in clone_repo_to_temp_dir
    shutil.rmtree(git_dir)
  File "C:\Users\yuval\anaconda3\Lib\shutil.py", line 759, in rmtree
    return _rmtree_unsafe(path, onerror)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\shutil.py", line 617, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "C:\Users\yuval\anaconda3\Lib\shutil.py", line 617, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "C:\Users\yuval\anaconda3\Lib\shutil.py", line 622, in _rmtree_unsafe
    onerror(os.unlink, fullname, sys.exc_info())
  File "C:\Users\yuval\anaconda3\Lib\shutil.py", line 620, in _rmtree_unsafe
    os.unlink(fullname)
PermissionError: [WinError 5] Access is denied: 'C:\\Users\\yuval\\AppData\\Local\\Temp\\tmpvmjhqr25\\.git\\objects\\pack\\pack-48b911d1bf9b980873e1d4b0cc2e68b52e19f74b.idx'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\yuval\anaconda3\Lib\site-packages\readmeai\main.py", line 66, in md_agent
    temp_dir = await asyncio.to_thread(
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\asyncio\threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\site-packages\readmeai\utils\github.py", line 40, in clone_repo_to_temp_dir
    raise ValueError(
ValueError: Error cloning git repository: [WinError 5] Access is denied: 'C:\\Users\\yuval\\AppData\\Local\\Temp\\tmpvmjhqr25\\.git\\objects\\pack\\pack-48b911d1bf9b980873e1d4b0cc2e68b52e19f74b.idx'

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\yuval\anaconda3\Scripts\readmeai.exe\__main__.py", line 7, in <module>
  File "C:\Users\yuval\anaconda3\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\site-packages\readmeai\cli\commands.py", line 37, in commands
    main(
  File "C:\Users\yuval\anaconda3\Lib\site-packages\readmeai\main.py", line 49, in main
    asyncio.run(md_agent(badges, config, config_helper))
  File "C:\Users\yuval\anaconda3\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\asyncio\base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "C:\Users\yuval\anaconda3\Lib\site-packages\readmeai\main.py", line 121, in md_agent
    await asyncio.to_thread(shutil.rmtree, temp_dir)
                                           ^^^^^^^^
UnboundLocalError: cannot access local variable 'temp_dir' where it is not associated with a value
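Two separate problems appear in this trace: Windows marks `.git` pack files read-only, so the plain `shutil.rmtree` fails, and the cleanup code then references `temp_dir` before it was ever assigned. A hedged sketch of both fixes (function names are hypothetical, not readme-ai's actual code):

```python
import os
import shutil
import stat
import tempfile


def _force_remove(func, path, exc_info):
    """shutil.rmtree onerror hook: clear the read-only bit (common for
    .git pack files on Windows) and retry the failed operation."""
    os.chmod(path, stat.S_IWRITE)
    func(path)


def clone_and_process(clone):
    temp_dir = None  # assign before the try so cleanup can test it safely
    try:
        temp_dir = tempfile.mkdtemp()
        clone(temp_dir)  # hypothetical: clone the repo into temp_dir
    finally:
        if temp_dir is not None and os.path.isdir(temp_dir):
            shutil.rmtree(temp_dir, onerror=_force_remove)
```

With the `None` sentinel, a failure before `mkdtemp` completes can no longer trigger the UnboundLocalError, and the `onerror` hook handles the WinError 5 case.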

Not reading Ruby files content

Hello,

Just ran this on a Ruby on Rails repo, and it seems to use the folder structure as the prompt to generate the summary instead of the actual content of each file. That happened to all files in the project. Installed using pip.

Screenshot 2023-11-10 at 10 27 23

Thanks

FileNotFoundError: [Errno 2] No such file or directory: 'tree'

Hello, I found your tool today and I am a big fan. However, I am getting this error when trying it on my remote repository.
Thanks for your help

Traceback (most recent call last):

  File "/Users/jkh98/Developer/Scratch/README-AI/src/main.py", line 103, in <module>
    app()

  File "/Users/jkh98/Developer/Scratch/README-AI/src/main.py", line 41, in main
    asyncio.run(generate_readme(api_key, local, output, remote))

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^

  File "/Users/jkh98/Developer/Scratch/README-AI/src/main.py", line 91, in generate_readme
    builder.build(conf, conf_helper, dependencies, documentation, intro_slogan)

  File "/Users/jkh98/Developer/Scratch/README-AI/src/builder.py", line 60, in build
    md_repo = create_directory_tree(url)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/Users/jkh98/Developer/Scratch/README-AI/src/builder.py", line 207, in create_directory_tree
    tree_bytes = subprocess.check_output(["tree", "-n", repo_path])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/subprocess.py", line 466, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/subprocess.py", line 548, in run
    with Popen(*popenargs, **kwargs) as process:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/subprocess.py", line 1024, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,

  File "/opt/homebrew/Cellar/[email protected]/3.11.3/Frameworks/Python.framework/Versions/3.11/lib/python3.11/subprocess.py", line 1917, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)

FileNotFoundError: [Errno 2] No such file or directory: 'tree'
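`create_directory_tree` shells out to the external `tree` binary, which macOS does not ship by default (`brew install tree` is one fix). A dependency-free fallback could render the tree in pure Python instead (a hypothetical replacement, not the project's actual implementation):

```python
import os


def directory_tree(root: str) -> str:
    """Render a directory tree like the `tree` CLI, without the binary."""
    lines = [os.path.basename(os.path.abspath(root)) or root]

    def walk(path, prefix=""):
        entries = sorted(os.listdir(path))
        for i, name in enumerate(entries):
            last = i == len(entries) - 1
            lines.append(prefix + ("└── " if last else "├── ") + name)
            full = os.path.join(path, name)
            if os.path.isdir(full):
                walk(full, prefix + ("    " if last else "│   "))

    walk(root)
    return "\n".join(lines)
```

Falling back to something like this when `subprocess` raises FileNotFoundError would keep the generator portable.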

Error installing in Ubuntu 22

Welcome to the README-AI environment setup script!
Checking for Python 3.8 or higher...
Python version is compatible.
Creating a new conda environment named 'readmeai' with Python 3.11 ...
Collecting package metadata (repodata.json): | setup/setup.sh: line 90:  4475 Killed                  conda env create -f setup/environment.yaml
Error creating the 'readmeai' environment. Aborting.
(base) root@ubuntu22:~/readme-ai# python3 --version
Python 3.11.5
(base) root@ubuntu22:~/readme-ai#

How can I solve this?

Update tiktoken Package Version to 0.6.0 for Python 3.11 Support [pip OPTION]

Currently, the tiktoken package is pinned at version 0.4.0, which does not support Python 3.11 or newer. Version 0.6.0 adds support for Python 3.11 and above. I kindly request updating the tiktoken dependency to 0.6.0 to ensure compatibility with the latest Python versions; this will provide a better experience for users on recent releases. I appreciate your consideration of this request.

Traceback stack from recent use of app

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/bin/readmeai", line 5, in <module>
    from readmeai.cli.main import main
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/readmeai/cli/main.py", line 9, in <module>
    from readmeai._agent import readme_agent
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/readmeai/_agent.py", line 22, in <module>
    from readmeai.models.dalle import DalleHandler
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/readmeai/models/dalle.py", line 7, in <module>
    from openai import Client, OpenAIError
ImportError: cannot import name 'Client' from 'openai' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/__init__.py)

  • Program doesn't run with this activation:
export OPENAI_API_KEY = <TOKEN>
$ readmeai --badge-style flat-square --repository path/to/repo --output readme_doc_gen.md 

Am I missing something obvious? I've installed the openai Python lib; I used pipx to install.
From the OpenAI docs I see a different way to initialize the client than the one provided in dalle.py.
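One likely culprit in the activation shown above is shell syntax: `export OPENAI_API_KEY = <TOKEN>` has spaces around `=`, which POSIX shells reject, so the variable never gets set. The assignment must be written without spaces (the key value here is a placeholder):

```shell
# Correct form: no spaces around "=" in a shell assignment/export
export OPENAI_API_KEY="sk-placeholder"
echo "$OPENAI_API_KEY"
```

The ImportError itself is separate: `openai.Client` only exists in openai>=1.0, so the installed openai version also needs checking.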

Docker image not working

I found this project a few weeks ago and thought it was really cool, but I can't build it anymore. I tried a manual install the same way I did it last time, the latest pip package as well as the 0.0.4 version, and both the latest Docker image and the 0.0.4 image, but none of them work. All the errors come from failing to import the local Python packages.

Richards-MBP:readme-ai richardroberson$ docker run -it \
> -e OPENAI_API_KEY="my-key" \
> -v "$(pwd)":/app \
> -w /app zeroxeli/readme-ai:0.0.4 \
> python src/main.py -o readme-ai.md -r https://github.com/richardr1126/ShutesAndLaddersGUI
python: can't open file '/app/src/main.py': [Errno 2] No such file or directory

HTTPStatus Exception: Client error '401 Unauthorized' for url 'https://api.openai.com/v1/chat/completions'

When adding my OpenAI API key in an attempt to generate a README file, I receive the following error multiple times.
HTTPStatus Exception:
Client error '401 Unauthorized' for URL 'https://api.openai.com/v1/chat/completions'

Here is my input, replacing my API key for security:
readmeai --api-key "OPEN_AI_API_KEY" --output README.md --repository https://github.com/TravisHammonds/decoder-ring.git

I then get the following output...
Screenshot 2023-10-17 at 12 03 47 PM

And the README file looks like this...
Screenshot 2023-10-17 at 12 07 10 PM

I have made sure that the API key has been input correctly and that the spelling and case sensitivity match. Any help would be appreciated.

Disclaimer: I am new to API integrations with OpenAI.

OSError: [Errno 36] File name too long (docker)

INFO README-AI is now executing.
INFO Successfully validated OpenAI API key.
INFO Model: {'endpoint': 'https://api.openai.com/v1/chat/completions', 'engine': 'gpt-3.5-turbo', 'encoding': 'cl100k_base', 'rate_limit': 5, 'tokens': 600, 'tokens_max': 4000, 'temperature': 0.9, 'api_key': '****************'}
INFO Repository: GitConfig(repository='https://github.com/yusing/Torrenium', name='Torrenium')
Traceback (most recent call last):
  File "/usr/local/bin/readmeai", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/readmeai/main.py", line 130, in cli
    asyncio.run(main(api_key, output, repository))
  File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.9/site-packages/readmeai/main.py", line 27, in main
    await generate_readme(llm)
  File "/usr/local/lib/python3.9/site-packages/readmeai/main.py", line 34, in generate_readme
    dependencies, file_text = get_dependencies(scanner, repository)
  File "/usr/local/lib/python3.9/site-packages/readmeai/main.py", line 88, in get_dependencies
    dependencies, file_text = scanner.get_dependencies(repository)
  File "/usr/local/lib/python3.9/site-packages/readmeai/preprocess.py", line 34, in get_dependencies
    dependencies = self.parser.get_dependency_file_contents(contents)
  File "/usr/local/lib/python3.9/site-packages/readmeai/preprocess.py", line 89, in get_dependency_file_contents
    parsed_content = parser(content["content"])
  File "/usr/local/lib/python3.9/site-packages/readmeai/parse.py", line 196, in parse_cmake
    with open(file_path) as f:
OSError: [Errno 36] File name too long: '# Project-level configuration.\ncmake_minimum_required(VERSION 3.10)\nproject(runner LANGUAGES CXX)\n\n# The name of the executable created for the application. Change this to change\n# the on-disk name of your application.\nset(BINARY_NAME "torrenium")\n# The unique GTK application identifier for this application. See:\n# https://wiki.gnome.org/HowDoI/ChooseApplicationID\nset(APPLICATION_ID "com.yusing.torrenium")\n\n# Explicitly opt in to modern CMake behaviors to avoid warnings with recent\n# versions of CMake.\ncmake_policy(SET CMP0063 NEW)\n\n# Load bundled libraries from the lib/ directory relative to the binary.\nset(CMAKE_INSTALL_RPATH "$ORIGIN/lib")\n\n# Root filesystem for cross-building.\nif(FLUTTER_TARGET_PLATFORM_SYSROOT)\n set(CMAKE_SYSROOT ${FLUTTER_TARGET_PLATFORM_SYSROOT})\n set(CMAKE_FIND_ROOT_PATH ${CMAKE_SYSROOT})\n set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)\n set(CMAKE_FIND_ROOT_PATH_MODE_PACKAGE ONLY)\n set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)\n set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)\nendif()\n\n# Define build configuration options.\nif(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)\n set(CMAKE_BUILD_TYPE "Debug" CACHE\n STRING "Flutter build mode" FORCE)\n set_property(CACHE CMAKE_BUILD_TYPE PROPERTY STRINGS\n "Debug" "Profile" "Release")\nendif()\n\n# Compilation settings that should be applied to most targets.\n#\n# Be cautious about adding new options here, as plugins use this function by\n# default. 
In most cases, you should add new options to specific targets instead\n# of modifying this function.\nfunction(APPLY_STANDARD_SETTINGS TARGET)\n target_compile_features(${TARGET} PUBLIC cxx_std_14)\n target_compile_options(${TARGET} PRIVATE -Wall -Werror)\n target_compile_options(${TARGET} PRIVATE "$&lt;$<NOT:$CONFIG:Debug>:-O3>")\n target_compile_definitions(${TARGET} PRIVATE "$&lt;$<NOT:$CONFIG:Debug>:NDEBUG>")\nendfunction()\n\n# Flutter library and tool build rules.\nset(FLUTTER_MANAGED_DIR "${CMAKE_CURRENT_SOURCE_DIR}/flutter")\nadd_subdirectory(${FLUTTER_MANAGED_DIR})\n\n# System-level dependencies.\nfind_package(PkgConfig REQUIRED)\npkg_check_modules(GTK REQUIRED IMPORTED_TARGET gtk+-3.0)\n\nadd_definitions(-DAPPLICATION_ID="${APPLICATION_ID}")\n\n# Define the application target. To change its name, change BINARY_NAME above,\n# not the value here, or flutter run will no longer work.\n#\n# Any new source files that you add to the application should be added here.\nadd_executable(${BINARY_NAME}\n "main.cc"\n "my_application.cc"\n "${FLUTTER_MANAGED_DIR}/generated_plugin_registrant.cc"\n)\n\n# Apply the standard set of build settings. This can be removed for applications\n# that need different build settings.\napply_standard_settings(${BINARY_NAME})\n\n# Add dependency libraries. Add any application-specific dependencies here.\ntarget_link_libraries(${BINARY_NAME} PRIVATE flutter)\ntarget_link_libraries(${BINARY_NAME} PRIVATE PkgConfig::GTK)\n\n# Run the Flutter tool portions of the build. This must not be removed.\nadd_dependencies(${BINARY_NAME} flutter_assemble)\n\n# Only the install-generated bundle's copy of the executable will launch\n# correctly, since the resources must in the right relative locations. 
To avoid\n# people trying to run the unbundled copy, put it in a subdirectory instead of\n# the default top-level location.\nset_target_properties(${BINARY_NAME}\n PROPERTIES\n RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/intermediates_do_not_run"\n)\n\n# Generated plugin build rules, which manage building the plugins and adding\n# them to the application.\ninclude(flutter/generated_plugins.cmake)\n\n\n# === Installation ===\n# By default, "installing" just makes a relocatable bundle in the build\n# directory.\nset(BUILD_BUNDLE_DIR "${PROJECT_BINARY_DIR}/bundle")\nif(CMAKE_INSTALL_PREFIX_INITIALIZED_TO_DEFAULT)\n set(CMAKE_INSTALL_PREFIX "${BUILD_BUNDLE_DIR}" CACHE PATH "..." FORCE)\nendif()\n\n# Start with a clean build bundle directory every time.\ninstall(CODE "\n file(REMOVE_RECURSE \"${BUILD_BUNDLE_DIR}/\")\n " COMPONENT Runtime)\n\nset(INSTALL_BUNDLE_DATA_DIR "${CMAKE_INSTALL_PREFIX}/data")\nset(INSTALL_BUNDLE_LIB_DIR "${CMAKE_INSTALL_PREFIX}/lib")\n\ninstall(TARGETS ${BINARY_NAME} RUNTIME DESTINATION "${CMAKE_INSTALL_PREFIX}"\n COMPONENT Runtime)\n\ninstall(FILES "${FLUTTER_ICU_DATA_FILE}" DESTINATION "${INSTALL_BUNDLE_DATA_DIR}"\n COMPONENT Runtime)\n\ninstall(FILES "${FLUTTER_LIBRARY}" DESTINATION "${INSTALL_BUNDLE_LIB_DIR}"\n COMPONENT Runtime)\n\nforeach(bundled_library ${PLUGIN_BUNDLED_LIBRARIES})\n install(FILES "${bundled_library}"\n DESTINATION "${INSTALL_BUNDLE_LIB_DIR}"\n COMPONENT Runtime)\nendforeach(bundled_library)\n\n# Fully re-copy the assets directory on each build to avoid having stale files\n# from a previous install.\nset(FLUTTER_ASSET_DIR_NAME "flutter_assets")\ninstall(CODE "\n file(REMOVE_RECURSE \"${INSTALL_BUNDLE_DATA_DIR}/${FLUTTER_ASSET_DIR_NAME}\")\n " COMPONENT Runtime)\ninstall(DIRECTORY "${PROJECT_BUILD_DIR}/${FLUTTER_ASSET_DIR_NAME}"\n DESTINATION "${INSTALL_BUNDLE_DATA_DIR}" COMPONENT Runtime)\n\n# Install the AOT library on non-Debug builds only.\nif(NOT CMAKE_BUILD_TYPE MATCHES "Debug")\n install(FILES "${AOT_LIBRARY}" 
DESTINATION "${INSTALL_BUNDLE_LIB_DIR}"\n COMPONENT Runtime)\nendif()\n'
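The OSError arises because `parse_cmake` receives the CMake file's *content* but passes it straight to `open()`, which treats the whole text as a path. A sketch of a parser that works on the string it is given (the regexes and return format are illustrative, not the project's actual parser):

```python
import re


def parse_cmake(content: str) -> list:
    """Extract dependency-like names from CMake file *content* directly,
    instead of open()-ing the content as if it were a file path."""
    dependencies = []
    for match in re.finditer(r"find_package\(\s*(\w+)", content):
        dependencies.append(match.group(1).lower())
    for match in re.finditer(r"pkg_check_modules\(\s*(\w+)", content):
        dependencies.append(match.group(1).lower())
    return dependencies
```

Since the preprocess layer already read the file, parsers downstream should never need a filesystem path at all.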

FileExistsError: [Errno 17] File exists: '/tmp/tmpvett9j9p'

OS: Fedora Linux 38

Command:

python src/main.py --api-key 12345 --output docs/README.md --local /home/pc/dev/project

Error:

2023-05-11 11:23:13 | INFO | logger:info:32 - README-AI is now executing.
2023-05-11 11:23:13 | INFO | logger:info:32 - Using local directory: /home/pc/dev/project
Traceback (most recent call last):

  File "/home/pc/README-AI/src/main.py", line 133, in <module>
    app()

  File "/home/pc/README-AI/src/main.py", line 32, in main
    asyncio.run(generate_readme(api_key, local, output, remote))

  File "/home/pc/.conda/envs/readmeai/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)

  File "/home/pc/.conda/envs/readmeai/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()

  File "/home/pc/README-AI/src/main.py", line 51, in generate_readme
    dependencies = preprocess.get_project_dependencies(repo, file_exts, file_names)

  File "/home/pc/README-AI/src/preprocess.py", line 201, in get_project_dependencies
    _clone_or_copy_repository(repo, temp_dir)

  File "/home/pc/README-AI/src/preprocess.py", line 46, in _clone_or_copy_repository
    shutil.copytree(repo, temp_dir)

  File "/home/pc/.conda/envs/readmeai/lib/python3.10/shutil.py", line 559, in copytree
    return _copytree(entries=entries, src=src, dst=dst, symlinks=symlinks,

  File "/home/pc/.conda/envs/readmeai/lib/python3.10/shutil.py", line 457, in _copytree
    os.makedirs(dst, exist_ok=dirs_exist_ok)

  File "/home/pc/.conda/envs/readmeai/lib/python3.10/os.py", line 225, in makedirs
    mkdir(name, mode)

FileExistsError: [Errno 17] File exists: '/tmp/tmpvett9j9p'
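The failure happens because `shutil.copytree` requires the destination directory not to exist, while the temp directory has already been created by the time the copy runs. A minimal sketch of a workaround (the helper name is hypothetical, not the project's actual function):

```python
import os
import shutil

def copy_repo_to_temp(repo: str, temp_dir: str) -> str:
    # shutil.copytree raises FileExistsError when the destination already
    # exists, as it does for a pre-created temp dir. Copying into a fresh
    # subdirectory sidesteps the collision; on Python 3.8+ another option
    # is shutil.copytree(repo, temp_dir, dirs_exist_ok=True).
    dest = os.path.join(temp_dir, "repo")
    shutil.copytree(repo, dest)
    return dest
```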

Client error '404 Not Found' for url 'https://api.openai.com/v1/chat/completions'

I get a 404 error.

readmeai --api-key "xxxxx" -o readme-ai.md -r /home/ben/REPO/fv-deploiement-applicatif

INFO README-AI is now executing.
INFO Successfully validated OpenAI API key.
INFO Model: {'endpoint': 'https://api.openai.com/v1/chat/completions', 'engine': 'gpt-4', 'encoding': 'cl100k_base', 'rate_limit': 5, 'tokens': 699, 'tokens_max': 3999, 'temperature': 0.9, 'api_key': '****************'}
INFO Repository: GitConfig(repository='/home/ben/REPO/fv-deploiement-applicatif', name='fv-deploiement-applicatif')
INFO Successfully cloned /home/ben/REPO/fv-deploiement-applicatif to /tmp/tmpl2vq0tfd.
INFO Dependencies: ['', 'xxxxxx']
INFO Total files: 42
WARNING Ignoring file: README.md
WARNING Ignoring file: ansible.cfg
ERROR HTTPStatus Exception:
Client error '404 Not Found' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://httpstatuses.com/404
ERROR HTTPStatus Exception:
Client error '404 Not Found' for url 'https://api.openai.com/v1/chat/completions'

Request to test the endpoint (this works):

(base) conf|main ⇒ curl --location 'https://api.openai.com/v1/chat/completions' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer sk-xxxxx' \
  --data '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "What is the OpenAI mission?Put the result in JSON format"}]
  }'

{
  "id": "chatcmpl-80sisBYMJVhyvN40gpAGE9E7JPTHB",
  "object": "chat.completion",
  "created": 1695221046,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "{\n  \"mission\": \"The mission of OpenAI is to ensure that artificial general intelligence (AGI) benefits all of humanity. They aim to build safe and beneficial AGI or to aid others in achieving this outcome. OpenAI commits to long-term safety, conducting research to make AGI safe, and driving the broad adoption of safety measures across the AI community. Additionally, they strive to cooperate with other institutions to address AGI's global challenges and distribute its benefits equitably.\"\n}"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 20,
    "completion_tokens": 98,
    "total_tokens": 118
  }
}

conf file:

# OpenAI API Settings

[api]
endpoint = "https://api.openai.com/v1/chat/completions"
engine = "gpt-3.5-turbo"
encoding = "cl100k_base"
rate_limit = 5
tokens = 669
tokens_max = 3899
temperature = 1.2

# Repository

[git]
repository = "https://github.com/eli64s/readme-ai"
name = "readme-ai"
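For debugging, here is a hypothetical sketch of the request the tool should be sending. Note that the failing run above is configured with engine gpt-4 while the working curl test uses gpt-3.5-turbo, so a 404 from this endpoint can simply mean the account has no access to the configured model:

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str,
                       endpoint: str = "https://api.openai.com/v1/chat/completions"):
    # Mirrors the successful curl call: a JSON body plus a Bearer token.
    # If the Authorization header or model name differs from this, the
    # API answers with an error status rather than a chat completion.
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode()
    return urllib.request.Request(
        endpoint,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
```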

ERROR HTTPStatus Exception: Client error '429 Too Many Requests' for url 'https://api.openai.com/v1/chat/completions'

I am using this with Docker. Here are the error messages.

My command is:

docker run -it \
  -e OPENAI_API_KEY=API_KEY \
  -v "$(pwd)":/app zeroxeli/readme-ai:latest \
  readmeai -o readme-ai.md -r https://github.com/ataur39n-sharif/book-catelog-backend

Error -

ERROR HTTPStatus Exception:
Client error '429 Too Many Requests' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://httpstatuses.com/429

Screenshots attached: doc-command, doc-0, doc-1, doc-2
ValueError: Shape of passed values is (27, 1), indices imply (27, 2)

I'm having this trouble

| ๐Ÿ“ˆ Scalability | The codebase is designed to be scalable and modular, with various components separated into their own packages to allow for easy management and organization. The heavy use of external dependencies could potentially lead to issues with scalability, and further testing would be needed to determine the system's ability to handle larger workloads. |
Traceback (most recent call last):
  File "/home/cesar/python/README-AI/src/main.py", line 144, in <module>
    asyncio.run(main(args.api_key, args.output, args.repository))
  File "/home/cesar/anaconda3/envs/readme_ai/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/home/cesar/anaconda3/envs/readme_ai/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/cesar/anaconda3/envs/readme_ai/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/cesar/python/README-AI/src/main.py", line 54, in main
    await generate_readme(gpt)
  File "/home/cesar/python/README-AI/src/main.py", line 74, in generate_readme
    builder.build_markdown(CONF, CONF_HELPER, dependencies, code_summary)
  File "/home/cesar/python/README-AI/src/builder.py", line 24, in build_markdown
    readme_parts = create_readme_parts(conf, conf_helper, packages, summaries)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/cesar/python/README-AI/src/builder.py", line 47, in create_readme_parts
    summary_df = create_markdown_tables(summaries)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/cesar/python/README-AI/src/builder.py", line 106, in create_markdown_tables
    summary_df = pd.DataFrame(summaries, columns=["Module", "Summary"])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/cesar/anaconda3/envs/readme_ai/lib/python3.11/site-packages/pandas/core/frame.py", line 798, in __init__
    mgr = ndarray_to_mgr(
          ^^^^^^^^^^^^^^^
  File "/home/cesar/anaconda3/envs/readme_ai/lib/python3.11/site-packages/pandas/core/internals/construction.py", line 337, in ndarray_to_mgr
    _check_values_indices_shape_match(values, index, columns)
  File "/home/cesar/anaconda3/envs/readme_ai/lib/python3.11/site-packages/pandas/core/internals/construction.py", line 408, in _check_values_indices_shape_match
    raise ValueError(f"Shape of passed values is {passed}, indices imply {implied}")
ValueError: Shape of passed values is (27, 1), indices imply (27, 2)

What could the cause be?
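The shape message says 27 rows of one element each were passed while two column names were expected, so each entry in `summaries` must be a (module, summary) pair. A minimal reproduction and fix (pandas is assumed to be installed; the data is illustrative):

```python
import pandas as pd

# Fails like the report: each row has one element, but two column
# names are supplied, so shapes (n, 1) and (n, 2) disagree.
bad = [("src/main.py",)] * 3
try:
    pd.DataFrame(bad, columns=["Module", "Summary"])
except ValueError:
    pass  # e.g. "Shape of passed values is (3, 1), indices imply (3, 2)"

# Works: each summary entry is a (module, summary) pair.
good = [("src/main.py", "Entry point"), ("src/builder.py", "Builds markdown")]
df = pd.DataFrame(good, columns=["Module", "Summary"])
```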

Error while running main.py => Got unexpected extra argument

I installed python3 and did all the setup.

When running the command
python3 src/main.py --output README_AI.md /Users/sachintomar/github/ethereum_rpc_client

I'm getting below error

Usage: main.py [OPTIONS]
Try 'main.py --help' for help.

Error: Got unexpected extra argument (/Users/sachintomar/github/ethereum_rpc_client)

can not generate readme => KeyError: 'build.gradle.kts'

Hi, I have a repository with Nuxt 3 and Tauri (alpha 2), and it fails to generate a README.

INFO     README-AI is now executing.
INFO     Successfully validated OpenAI API key.
INFO     Model: {'endpoint': 'https://api.openai.com/v1/chat/completions', 'engine': 'gpt-3.5-turbo', 'encoding': 'cl100k_base', 'rate_limit': 5, 'tokens': 669, 'tokens_max': 3899, 'temperature': 1.2, 'api_key': '****************'}
INFO     Repository: GitConfig(repository='../la-caisse', name='la-caisse')
INFO     Successfully cloned ../la-caisse to /var/folders/yk/1zcj86td3xn84_ty8414k95c0000gn/T/tmp8z2bf1rk.
Traceback (most recent call last):
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/./venv/bin/readmeai", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/readmeai/main.py", line 131, in cli
    asyncio.run(main(api_key, output, repository))
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/readmeai/main.py", line 28, in main
    await generate_readme(llm)
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/readmeai/main.py", line 35, in generate_readme
    dependencies, file_text = get_dependencies(scanner, repository)
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/readmeai/main.py", line 89, in get_dependencies
    dependencies, file_text = scanner.get_dependencies(repository)
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/readmeai/preprocess.py", line 32, in get_dependencies
    dependencies = self.parser.get_dependency_file_contents(contents)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/benoitdeveaux/Documents/ugo/readme.ai/venv/lib/python3.11/site-packages/readmeai/preprocess.py", line 86, in get_dependency_file_contents
    parser = file_parsers[content["name"]]
             ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
KeyError: 'build.gradle.kts'

I tried adding src-tauri to the ignore file, but without success.

EDIT: it looks like there is an issue with the conf file. I tried editing the conf folder in site-packages/readmeai, but that did not work either (commented out the following entries):

    # 'build.gradle',
    # 'build.gradle.kts',
    # 'pom.xml',

Support for Java

I see you currently support Python, Go, Rust, and JavaScript.
I was wondering, is there a timeframe for when you will add support for Java?

Application unable to accept GPT API key

Perhaps this is related to issue #76, but I am unable to get the application to recognise my paid API key. I've attempted it with both Docker and conda. I'm on Ubuntu 22.04.3 with Python 3.9.18.

Using Docker
sudo docker run -it -e OPENAI_API_KEY=Key_went_here -v "$(pwd)":/app zeroxeli/readme-ai:latest readmeai -o readme-ai.md -r Repo_went_here

Using Conda
First:
export OPENAI_API_KEY=YOUR_API_KEY
Then:
conda create -n readmeai python=3.9
I enter the conda environment with:
source activate readmeai
and install requirements.txt with:
pip install -r requirements.txt

Finally running the application with
readmeai --output readme-ai.md --repository Repo_goes_here

Both implementations give me
"ERROR HTTPStatus Exception:
Client error '404 Not Found' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://httpstatuses.com/404
with the link saying:

{
  "error": {
    "message": "You didn't provide an API key. You need to provide your API key in an Authorization header using Bearer auth (i.e. Authorization: Bearer YOUR_KEY), or as the password field (with blank username) if you're accessing the API from your browser and are prompted for a username and password. You can obtain an API key from https://platform.openai.com/account/api-keys.",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}

I even tried to modify the https://github.com/eli64s/readme-ai/blob/4cc83dc49e0f5bf3b304d504394a9adf7c028acb/readmeai/core/model.py#L176C1-L176C75 to directly include my API key as a string, but this did not fix the issue.
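Since the API's error body says no key reached the server, it is worth verifying that the key actually survives into the process environment (a `sudo docker run` or a fresh shell are common ways an exported variable gets lost). A hypothetical resolution helper, not the tool's real code:

```python
import os

def resolve_api_key(cli_key: str = "") -> str:
    # Prefer an explicit CLI value, then fall back to the environment.
    key = cli_key or os.environ.get("OPENAI_API_KEY", "")
    if not key:
        raise SystemExit("OPENAI_API_KEY is not set in this environment")
    return key
```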

Feature Request: Extend configuration to include the openai endpoint and model.

Hi!

Firstly: I'm loving this project. I can definitely see it making a team a lot more efficient, while also ensuring the README doesn't end up as dead documentation that's never up to date.

A feature request: make the configuration more extendable. Personally, I am using Azure OpenAI, which has a different endpoint for chat completions. Furthermore, it looks like this runs on the gpt-3.5-turbo model; I have gpt-4 available, so I would love to be able to switch and see whether it does a better or worse job.
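Building on the [api] section of the conf file shown in the issues above, the request could be covered by letting users override the endpoint and engine directly. The Azure values below are placeholders illustrating the idea, not tested configuration:

```toml
[api]
# Point at an Azure OpenAI deployment instead of api.openai.com
# (resource name, deployment name, and api-version are placeholders):
endpoint = "https://YOUR_RESOURCE.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT/chat/completions?api-version=2023-05-15"
engine = "gpt-4"
encoding = "cl100k_base"
```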

FileNotFoundError: [Errno 2] No such file or directory: 'docs/html/readme.html'

Hey, just tried this repo as I really like the idea of using davinci to generate docs.

Here is a bug I found while running src/main.py :

  File "/home/.../PydocsAI/src/utils.py", line 38, in write_html
    with open(self._get_file_path(file_name), "w") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'docs/html/readme.html'

I think this is because docs/html is not there when pulling the repo :)
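A likely fix, sketched under the assumption that `write_html` controls the output path, is to create the parent directory before opening the file:

```python
import os

def write_html(file_path: str, html: str) -> None:
    # docs/html is not checked into the repo, so create it on demand
    # before writing; exist_ok=True makes repeated runs safe.
    os.makedirs(os.path.dirname(file_path), exist_ok=True)
    with open(file_path, "w", encoding="utf-8") as f:
        f.write(html)
```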

ValueError: 3 columns passed, passed data had 2 columns

python src/main.py --o readme-ai.md -r https://github.com/tuipoland/tui-infastructure-terraform-module-documentdb

INFO     README-AI is now executing.
Traceback (most recent call last):
  File "/home/ll1rro/.local/lib/python3.11/site-packages/pandas/core/internals/construction.py", line 934, in _finalize_columns_and_data
    columns = _validate_or_indexify_columns(contents, columns)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/.local/lib/python3.11/site-packages/pandas/core/internals/construction.py", line 981, in _validate_or_indexify_columns
    raise AssertionError(
AssertionError: 3 columns passed, passed data had 2 columns

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/ll1rro/Downloads/readme-ai/src/main.py", line 137, in <module>
    asyncio.run(main(args.api_key, args.output, args.repository))
  File "/usr/lib64/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/ll1rro/Downloads/readme-ai/src/main.py", line 48, in main
    await generate_readme(gpt)
  File "/home/ll1rro/Downloads/readme-ai/src/main.py", line 55, in generate_readme
    dependencies, file_text = get_dependencies(scanner, repository)
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/Downloads/readme-ai/src/main.py", line 75, in get_dependencies
    dependencies, file_text = scanner.get_dependencies(repository)
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/Downloads/readme-ai/src/preprocess.py", line 33, in get_dependencies
    contents = self.parser.analyze(repository, is_remote)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/Downloads/readme-ai/src/preprocess.py", line 85, in analyze
    df = self.process_language_mapping(df)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/Downloads/readme-ai/src/preprocess.py", line 113, in process_language_mapping
    df[["install", "run", "test"]] = pd.DataFrame.from_records(
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/.local/lib/python3.11/site-packages/pandas/core/frame.py", line 2266, in from_records
    arrays, arr_columns = to_arrays(data, columns)
                          ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/.local/lib/python3.11/site-packages/pandas/core/internals/construction.py", line 840, in to_arrays
    content, columns = _finalize_columns_and_data(arr, columns, dtype)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ll1rro/.local/lib/python3.11/site-packages/pandas/core/internals/construction.py", line 937, in _finalize_columns_and_data
    raise ValueError(err) from err
ValueError: 3 columns passed, passed data had 2 columns
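The assertion means `from_records` received rows of two elements while three column names ("install", "run", "test") were supplied, so at least one language-mapping entry is missing a command. A minimal sketch of padding short records so the column count always matches (pandas assumed installed; this is illustrative, not the project's actual code):

```python
import pandas as pd

records = [
    ("pip install -r requirements.txt", "python main.py", "pytest"),
    ("terraform init", "terraform apply"),  # missing the 'test' command
]
# Pad every record to exactly three fields before building the frame.
padded = [tuple(r) + ("",) * (3 - len(r)) for r in records]
df = pd.DataFrame.from_records(padded, columns=["install", "run", "test"])
```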
