The Simple AI Reviewer is designed to assist developers working alone. It aims to highlight potential issues within a codebase, offer suggestions for improvements, or simply provide a fresh perspective.
Its outputs may be insightful, misleading, or entirely useless — treat its suggestions with a healthy dose of skepticism. This tool is intended as a supplement to your own expertise, not a replacement for it.
Currently supports:
- Gitea
- Github
 
1️⃣ Clone the repository

```bash
git clone https://github.com/amest/simple-ai-reviewer.git
```

2️⃣ Configure Git hosting:
- If using Gitea, follow the Gitea configuration steps below
- If using Github, follow the Github configuration steps below
 
To run the application using Docker, follow these steps:

- Configure the application using environment variables. You can find a sample configuration in the `compose.yml` file (a rough sketch is shown after this list). For all available configuration options, see the Application configurations section or the `appsettings.json` file.
- Build and start the application using docker compose:

```bash
docker-compose up --build
```

This will:
- Build the Docker image
- Start the application container
- Expose the service on the configured port (default: 8888)
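The `compose.yml` shipped in the repository is the authoritative sample; the sketch below is only an illustration of what such a file could look like, reusing the image name and the environment variables from the `docker run` example that follows (service name, URLs, and values are placeholders):

```yaml
services:
  simple-ai-reviewer:
    image: eluki/simple-ai-reviewer               # same image as the docker run example below
    ports:
      - "8888:8888"
    environment:
      web__token: CHANGE_ME                       # webhook authorization token
      web__port: "8888"
      web__host: 0.0.0.0
      gitea__base_url: https://gitea.example.com  # placeholder Gitea URL
      gitea__token: GITEA_ACCESS_TOKEN
      gitea__allowed_emails: "*"
      llm__type: ollama
      llm__model: "gemma3:12b"
      llm__base_url: http://localhost:11434       # placeholder Ollama address
      review__review_as_conversations: "true"
    restart: unless-stopped
```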
 
Alternatively, you can run the container directly with `docker run`:

```bash
docker run -d \
  --name simple-ai-reviewer \
  -p 8888:8888 \
  -e web__token=AUTH_TOKEN_FOR_WEBHOOKS_GENERATE_AND_CHANGE_ME \
  -e web__port=8888 \
  -e web__host=0.0.0.0 \
  -e gitea__base_url=GITEA_BASE_URL \
  -e gitea__token=GITEA_ACCESS_TOKEN \
  -e gitea__allowed_emails=* \
  -e github__token=GITHUB_PRIVATE_ACCESS_TOKEN \
  -e github__allowed_logins=* \
  -e llm__type=ollama \
  -e llm__model=gemma3:12b \
  -e llm__base_url=http://192.168.0.60:11434 \
  -e review__review_as_conversations=true \
  --restart unless-stopped \
  eluki/simple-ai-reviewer
```
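Once the container is running (via either method), you can follow its logs to confirm the service started on the configured port. The name below matches the `--name` used in the `docker run` example; docker compose may generate a different container name:

```bash
# Follow the container logs; the exact startup messages depend on the application.
docker logs -f simple-ai-reviewer
```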
To run the application without Docker, follow these steps:

- Configure the application by editing the `appsettings.json` file. For detailed configuration options with comments, see the Application configurations section.
- Install the required dependencies:

```bash
pip install -r requirements.txt
```

- Run the service:

```bash
python main.py
```

The service will start and listen on the configured host and port (default: localhost:8888).
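A virtual environment is not required by the project, but if you want the dependencies isolated, the same steps can be run inside a standard Python venv (the directory name `.venv` is arbitrary):

```bash
python -m venv .venv               # create a virtual environment
source .venv/bin/activate          # on Windows: .venv\Scripts\activate
pip install -r requirements.txt    # install the project dependencies
python main.py                     # start the service on the configured host and port
```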
Gitea configuration:
- Create a user (example: `ai-reviewer`)
  - Optional: create an access token for this user
- Add this user to your repositories as a co-author (optional if you created a dedicated user)
- Configure a webhook in your Gitea repository:
  - Endpoint: `https://your-domain-with-running-service/webhook/gitea?token=TOKEN_FROM_APPSETTINGS`
  - Content type: `application/json`
  - Select webhook events: 'Pull Request Comment' (required) and/or 'Issue Comment' (optional)
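As an optional sanity check (not part of the project's own instructions), you can verify the Gitea access token against Gitea's standard API before wiring up the webhook:

```bash
# Replace GITEA_BASE_URL and GITEA_ACCESS_TOKEN with the values from your configuration.
# A valid token should return the JSON profile of the reviewer user.
curl -H "Authorization: token GITEA_ACCESS_TOKEN" "GITEA_BASE_URL/api/v1/user"
```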
 
 
Github configuration:
- Create a user (example: `{generated-username}-ai-reviewer`)
  - Optional: create an access token for this user with the following permissions:
    - Contents: Read-only
    - Issues: Read and write
    - Pull requests: Read and write
- Add this user to your repositories as a co-author (optional if you created a dedicated user)
- Configure a webhook in your Github repository:
  - Endpoint: `https://your-domain-with-running-service/webhook/github?token=TOKEN_FROM_APPSETTINGS`
  - Content type: `application/json`
  - Select only one webhook event: Issue comments
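Similarly, you can confirm that the Github personal access token works using Github's standard REST API (again, an optional check that is not part of the project's own instructions):

```bash
# A valid token should return the JSON profile of the token's user.
curl -H "Authorization: Bearer GITHUB_PRIVATE_ACCESS_TOKEN" https://api.github.com/user
```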
 
 
| Section | Parameter | Description |
|---|---|---|
| web | host | bind address |
| web | port | bind port |
| web | token | authorization token used to secure webhook connections |
| gitea | base_url | your Gitea URL |
| gitea | token | personal access token for API access |
| gitea | allowed_emails | list of emails (separated by ; or ,) of users allowed to run a review |
| github | token | personal access token for API access |
| github | allowed_logins | list of logins (separated by ; or ,) of users allowed to run a review |
| llm | type | provider type: ollama, openai-compatible |
| llm | base_url | URL of the Ollama or OpenAI-compatible server |
| llm | model | model name used by the service |
| llm | token | API token for connecting to an OpenAI-compatible server |
| review | language | prompt language: ru, en |
| review | ignore_files | list of file names (separated by ,) excluded from the review |
| review | review_as_comments | enable/disable posting the review as a single comment on the Pull Request |
| review | review_as_conversations | enable/disable posting the review as conversations on files in the Pull Request (comments attached to files in Changes) |
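For reference, a minimal `appsettings.json` could look like the sketch below. The nesting (one JSON object per section) is an assumption inferred from the `section__parameter` environment-variable names used in the Docker examples, and all values, including the value types, are placeholders; the `appsettings.json` shipped in the repository is the authoritative template.

```json
{
  "web": { "host": "0.0.0.0", "port": 8888, "token": "CHANGE_ME" },
  "gitea": { "base_url": "https://gitea.example.com", "token": "GITEA_ACCESS_TOKEN", "allowed_emails": "*" },
  "github": { "token": "GITHUB_PRIVATE_ACCESS_TOKEN", "allowed_logins": "*" },
  "llm": { "type": "ollama", "base_url": "http://localhost:11434", "model": "gemma3:12b", "token": "" },
  "review": { "language": "en", "ignore_files": "", "review_as_comments": true, "review_as_conversations": false }
}
```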
- Create a Pull Request in your Gitea/Github repository
- Write a comment in the PR with the text `/start_review`
- 🤖 The AI Reviewer will automatically analyze the changes and post comments
- 🔍 Review the suggestions and apply them as needed
 
- This is a basic implementation and may require adjustments based on your project's specific needs.
- Consider providing feedback to help improve the accuracy and usefulness of the AI reviewer.
- Remember to validate the reviewer's findings and use your own judgment.
- ❗ In per-file reviews, some comments may miss the exact lines they refer to (the feature works, but its accuracy depends on the model used)
- ❗ In general, a lot depends on the model used to conduct the review. Testing was done with the following models:
  - Ollama
    - Gemma3:1b - Weak and suitable only for very simple Pull Requests, although in general it does find and highlight useful things. Useless in per-file review mode.
    - Gemma3:4b - Reviews reasonably well in the regular comments mode (as far as a small model can), but in per-file review mode it identifies files and lines poorly and can attach every comment to a single file. Per-file mode is not worth trying.
  - Openai-compatible
    - deepseek-chat - Decent reviews in the regular comments mode. In per-file review mode it also links comments/advice/questions to the right files, but it misses the exact diff line roughly 50% of the time, which is hard to get used to.
 
