# Noir Enhances AI Integration for Advanced Analysis
Noir v0.20.0 has arrived, supercharging its AI-powered security analysis capabilities. This release broadens AI collaboration beyond Ollama to include a diverse set of providers, enhancing Noir's flexibility and performance:
- OpenAI
- xAI
- GitHub Marketplace
- Ollama
- LM Studio
- vLLM
- Other custom completions API services
While there are various improvements, this article focuses on the AI-related features.
## Exploring New AI Features
Noir v0.20.0 introduces flexible AI integration with three new flags: `--ai-provider`, `--ai-model`, and `--ai-key`. These are required to enable AI features and let you customize your analysis:
- `--ai-provider`: Choose a provider (e.g., `openai`) or specify a custom API URL (e.g., `http://my-custom-api:9000`).
- `--ai-model`: Define the AI model (e.g., `gpt-4`).
- `--ai-key`: Authenticate with your API key (or use the `NOIR_AI_KEY` environment variable for security).
The `--ai-provider` flag currently supports the prefixes listed below; alternatively, you can pass a direct URL instead of a prefix.
| Prefix | Default Host |
|---|---|
| openai | https://api.openai.com |
| xai | https://api.x.ai |
| github | https://models.github.ai |
| azure | https://models.inference.ai.azure.com |
| vllm | http://localhost:8000 |
| ollama | http://localhost:11434 |
| lmstudio | http://localhost:1234 |
For example, to scan a directory with OpenAI’s GPT-4:
```shell
# Set AI provider and model
noir -b . --ai-provider=openai --ai-model=gpt-4 --ai-key=your-api-key
```
To keep your API key secure, store it in the `NOIR_AI_KEY` environment variable or use a Key Management Service like Vault instead of passing it directly on the command line.
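The environment-variable route can be sketched like this (the key value is a placeholder, and the commented-out scan assumes noir reads `NOIR_AI_KEY` when `--ai-key` is omitted):

```shell
# Export the key once per session so it never appears in your shell
# history or process arguments ("your-api-key" is a placeholder).
export NOIR_AI_KEY="your-api-key"

# The scan itself can then omit --ai-key:
# noir -b . --ai-provider=openai --ai-model=gpt-4
```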
## Hands-On Examples
Here’s how to run Noir v0.20.0 with different AI providers to scan code or test fixtures, such as the `hahwul` directory.
### Local LLM with Ollama
Download the `llama3` model using Ollama:

```shell
ollama pull llama3
```
Start the local Ollama server (`serve` takes no model argument; the model is selected per request):

```shell
ollama serve
```
Run the scan:

```shell
noir -b ./spec/functional_test/fixtures/hahwul/ \
  --ai-provider=ollama \
  --ai-model=llama3
```
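Before scanning, it can help to confirm the Ollama server is actually reachable on its default port (11434, per the table above). A small check, assuming Ollama's standard `/api/tags` endpoint, which lists locally available models:

```shell
# Probe the default Ollama host; a failed request (server down, curl
# missing) simply reports "down" instead of aborting.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  OLLAMA_STATUS="up"
else
  OLLAMA_STATUS="down"
fi
echo "ollama server: $OLLAMA_STATUS"
```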
### Local LLM with LM Studio
Open LM Studio, go to the “Models” tab, and download your preferred model (e.g., `phi-4`).

*LM Studio > Models*
In the “Developer” tab, click “Start Server” (or press `Cmd+R`).

*LM Studio > Developer > Start Server*
Run the scan:

```shell
noir -b ./spec/functional_test/fixtures/hahwul \
  --ai-provider=lmstudio \
  --ai-model=phi-4
```
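LM Studio's local server speaks the OpenAI-compatible API, so you can sanity-check it (and confirm the exact identifier to pass to `--ai-model`) by listing the loaded models. A sketch assuming the default port from the table above:

```shell
# Query the OpenAI-compatible model list; fall back to an empty list
# if the server is not running so the snippet degrades gracefully.
MODELS_JSON=$(curl -sf http://localhost:1234/v1/models 2>/dev/null || echo '{"data":[]}')
echo "$MODELS_JSON"
```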
### With xAI
Visit the xAI Console and generate an API key.
Run the scan:

```shell
noir -b ./spec/functional_test/fixtures/hahwul \
  --ai-provider=xai \
  --ai-model=grok-2-1212 \
  --ai-key=xai-t...
```
### With GitHub Models
In GitHub, go to Settings > Developer Settings > Personal Access Tokens > Generate new token (fine-grained): https://github.com/settings/personal-access-tokens
Visit GitHub Marketplace Models (https://github.com/marketplace/models), select a model, and open the “Code” tab to find the model name to use (e.g., `OpenAI GPT-4o` -> `gpt-4o`).
Run the scan:

```shell
noir -b ./spec/functional_test/fixtures/hahwul \
  --ai-provider=github \
  --ai-model=gpt-4o \
  --ai-key=github_....
```
For more information on flag usage and AI integration, please refer to the official documentation.
## Deprecated
The old `--ollama` and `--ollama-model` flags are now deprecated in favor of the more versatile `--ai-provider` and `--ai-model` flags. They still work for now, but we recommend switching to the new options to future-proof your workflows.
## Conclusion
With Noir v0.20.0, you can now tap into a wide range of AI providers—whether keeping it local with LM Studio or scaling up with OpenAI and xAI—making scans faster and more precise. Personally, I’ve been using a local LLM at my company for code analysis, and it’s been a game-changer, especially in catching overlooked APIs that might slip through manual reviews.
This flexibility really shines in real-world workflows. We’re excited to keep pushing AI integration forward, with plans to expand provider support and refine prompts through automated testing. Give it a try, and let us know how it boosts your scans!