FAQ
Welcome to the Naftiko Framework FAQ! This guide answers common questions from developers who are learning, using, and contributing to Naftiko. For comprehensive technical details, see the Specification.
A: Naftiko Framework is the first open-source platform for Spec-Driven AI Integration. Instead of writing boilerplate code to consume APIs and expose unified interfaces, you declare them in YAML. This enables:
- API composability: Combine multiple APIs into a single capability
- Format conversion: Convert between JSON, XML, Avro, Protobuf, CSV, and YAML
- AI-ready integration: Better context engineering for AI systems
- Reduced API sprawl: Manage microservices and SaaS complexity
Use it when you need to integrate multiple APIs, standardize data formats, or expose simplified interfaces to AI agents.
A: You only need to know:
- YAML syntax - the configuration language for capabilities
- JSONPath - for extracting values from JSON responses
- Mustache templates - for injecting parameters (optional, if using advanced features)
You don't need to write Java or other code unless you want to extend the framework itself.
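The Mustache part is just `{{name}}` placeholder substitution. As an illustration of those semantics only (not Naftiko's actual template engine), here is a minimal sketch in Python:

```python
import re

def render(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with values, Mustache-style."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

print(render('username: "{{github_username}}"', {"github_username": "octocat"}))
# username: "octocat"
```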
A: It's a runtime engine. The Naftiko Engine, provided as a Docker container, reads your YAML capability file at startup and immediately exposes HTTP or MCP interfaces. There's no compilation step - declare your capability, start the engine, and it works.
A: There are two ways:

- Docker (recommended):

  ```
  docker pull ghcr.io/naftiko/framework:v0.4
  docker run -p 8081:8081 -v /path/to/capability.yaml:/app/capability.yaml ghcr.io/naftiko/framework:v0.4 /app/capability.yaml
  ```

- CLI tool (for configuration and validation): download the binary for macOS, Linux, or Windows
See the Installation guide for detailed setup instructions.
A: Use the CLI validation command:
```
naftiko validate path/to/capability.yaml
naftiko validate path/to/capability.yaml 0.4   # Specify schema version
```

This checks your YAML against the Naftiko schema and reports any errors.
A: Use the latest stable version: 0.4 (as of March 2026).
Set it in your YAML:

```yaml
naftiko: "0.4"
```

A: These are the two essential parts of every capability:
- Exposes - What your capability provides to callers (HTTP API or MCP server)
- Consumes - What external APIs your capability uses internally
Example: A capability that consumes the Notion API and GitHub API, then exposes them as a single unified REST endpoint or MCP tool.
A:
| Feature | API (REST) | MCP |
|---|---|---|
| Protocol | HTTP/REST | Model Context Protocol (JSON-RPC) |
| Best for | General-purpose integrations, web apps | AI agent-native integrations |
| Tool discovery | Manual or via OpenAPI | Automatic via MCP protocol |
| Configuration | `type: "api"` with resources/operations | `type: "mcp"` with tools |
| Default transport | HTTP | stdio or HTTP (streamable) |
Use API for traditional REST clients, web applications, or when you want standard HTTP semantics.
Use MCP when exposing capabilities to AI agents or Claude.
A: A namespace is a unique identifier for a consumed or exposed source, used for routing and references.
- In `consumes`: `namespace: github` means "this is the GitHub API I'm consuming"
- In `exposes`: `namespace: app` means "my exposed API is called `app`"
- In steps: `call: github.get-user` routes to the consumed `github` namespace
Namespaces must be unique within their scope (all consumed namespaces must differ, all exposed namespaces must differ).
A: Steps enable multi-step orchestration - calling multiple APIs in sequence and combining their results.
Simple mode (direct call):
```yaml
operations:
  - method: GET
    call: github.get-user   # Call one consumed operation directly
    with:
      username: "{{github_username}}"
```

Orchestrated mode (multi-step):
```yaml
operations:
  - name: complex-operation
    method: GET
    steps:
      - type: call
        name: step1
        call: github.list-users
      - type: call
        name: step2
        call: github.get-user
        with:
          username: $step1.result   # Use output from step1
    mappings:
      - targetName: output_field
        value: $.step2.userId
```

Use steps when your capability needs to combine data from multiple sources or perform lookups.
A:
- `call` steps: Execute a consumed operation (HTTP request)
- `lookup` steps: Search through a previous step's output for matching records
Example: Call an API to list all users, then lookup which one matches a given email:
```yaml
steps:
  - type: call
    name: list-all-users
    call: hr.list-employees
  - type: lookup
    name: find-user-by-email
    index: list-all-users
    match: email
    lookupValue: "{{email_to_find}}"
    outputParameters:
      - fullName
      - department
```

A: Use the `with` injector in your exposed operation:
Simple mode:
```yaml
operations:
  - method: GET
    call: github.get-user
    with:
      username: "{{github_username}}"   # From externalRefs
      accept: "application/json"        # Static value
```

Orchestrated mode (steps):
```yaml
steps:
  - type: call
    name: fetch-user
    call: github.get-user
    with:
      username: "{{github_username}}"
```

The `with` object maps consumed operation parameter names to:

- Variable references like `{{variable_name}}` - injected from externalRefs
- Static strings or numbers - literal values
A: Use JsonPath expressions in the value field of outputParameters:
```yaml
consumes:
  - resources:
      - operations:
          - outputParameters:
              - name: userId
                type: string
                value: $.id              # Top-level field
              - name: email
                type: string
                value: $.contact.email   # Nested field
              - name: allNames
                type: array
                value: $.users[*].name   # Array extraction
```

Common JsonPath patterns:
- `$.fieldName` - access a field
- `$.users[0].name` - access an array element
- `$.users[*].name` - extract all `.name` values from an array
- `$.data.user.email` - nested path
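To make the extraction semantics concrete, here is a toy evaluator for just these four patterns, written with the Python standard library only. It illustrates how a JsonPath walks a document; it is not Naftiko's implementation, and real JsonPath engines support far more syntax:

```python
import re

def jsonpath(expr: str, data):
    """Evaluate a tiny JsonPath subset: $.a.b, $.a[0].b, $.a[*].b."""
    # Tokenize into field names and index/wildcard selectors
    tokens = re.findall(r"\.(\w+)|\[(\*|\d+)\]", expr)
    results = [data]
    for field, index in tokens:
        next_results = []
        for node in results:
            if field:                 # .fieldName
                next_results.append(node[field])
            elif index == "*":        # [*] fans out over the array
                next_results.extend(node)
            else:                     # [n] selects one element
                next_results.append(node[int(index)])
        results = next_results
    # For readability, return a bare value unless a wildcard fanned out
    return results if "*" in expr else results[0]

doc = {"users": [{"name": "ada"}, {"name": "lin"}],
       "data": {"user": {"email": "a@b.c"}}}
print(jsonpath("$.users[*].name", doc))    # ['ada', 'lin']
print(jsonpath("$.users[0].name", doc))    # 'ada'
print(jsonpath("$.data.user.email", doc))  # 'a@b.c'
```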
A: Mappings connect step outputs to exposed operation outputs in multi-step orchestrations.
```yaml
steps:
  - type: call
    name: fetch-db
    call: notion.get-database
  - type: call
    name: query-db
    call: notion.query-database
mappings:
  - targetName: database_name       # Exposed output parameter
    value: $.fetch-db.dbName        # From first step's output
  - targetName: row_count
    value: $.query-db.resultCount   # From second step
outputParameters:
  - name: database_name
    type: string
  - name: row_count
    type: number
```

Mappings tell Naftiko how to wire step outputs to your final response.
A: Add an authentication block to your consumes section:
```yaml
consumes:
  - type: http
    namespace: github
    baseUri: https://api.github.com
    authentication:
      type: bearer
      token: "{{github_token}}"   # Use token from externalRefs
```

Supported authentication types:
- `bearer` - Bearer token
- `basic` - Username/password
- `apikey` - Header or query parameter API key
- `digest` - HTTP Digest authentication
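To clarify what these types mean on the wire, the sketch below builds the HTTP header each one implies. This illustrates standard HTTP auth conventions, not Naftiko code, and digest is omitted because it requires a server challenge before any header can be built:

```python
import base64

def auth_headers(auth: dict) -> dict:
    """Build the HTTP header each declared auth type implies."""
    kind = auth["type"]
    if kind == "bearer":
        return {"Authorization": f"Bearer {auth['token']}"}
    if kind == "basic":
        creds = f"{auth['username']}:{auth['password']}".encode()
        return {"Authorization": "Basic " + base64.b64encode(creds).decode()}
    if kind == "apikey":
        return {auth["name"]: auth["value"]}
    raise ValueError(f"unsupported auth type: {kind}")

print(auth_headers({"type": "bearer", "token": "ghp_xxx"}))
# {'Authorization': 'Bearer ghp_xxx'}
print(auth_headers({"type": "basic", "username": "user", "password": "pass"}))
# {'Authorization': 'Basic dXNlcjpwYXNz'}
```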
A: Use externalRefs to declare variables that are injected at runtime:
```yaml
externalRefs:
  - name: secrets
    type: environment
    resolution: runtime
    keys:
      github_token: GITHUB_TOKEN   # Maps env var to template variable
      notion_token: NOTION_TOKEN

consumes:
  - namespace: github
    authentication:
      type: bearer
      token: "{{github_token}}"    # Use the injected variable
  - namespace: notion
    authentication:
      type: bearer
      token: "{{notion_token}}"
```

At runtime, provide environment variables:

```
docker run -e GITHUB_TOKEN=ghp_xxx -e NOTION_TOKEN=secret_xxx ...
```
⚠️ Security note: Use `resolution: runtime` in production (not `file`). Never commit secrets to your repository.
A: Yes, add authentication to your exposes block:
```yaml
exposes:
  - type: api
    port: 8081
    namespace: my-api
    authentication:
      type: apikey
      in: header
      name: X-Api-Key
      value: "{{api_key}}"
    resources:
      - path: /data
        description: Protected data endpoint
```

Supported authentication types for exposed endpoints:
- `apikey` - API key via header or query parameter
- `bearer` - Bearer token validation
- `basic` - Username/password via HTTP Basic Auth
A: Yes, use the body field for request bodies:
```yaml
consumes:
  - resources:
      - operations:
          - method: POST
            body:
              type: json
              data:
                filter:
                  status: "active"
```

Body types:
- `json` - JSON object or string
- `text`, `xml`, `sparql` - Plain text payloads
- `formUrlEncoded` - URL-encoded form
- `multipartForm` - Multipart file upload
A: Use path parameters with curly braces:
```yaml
exposes:
  - resources:
      - path: /users/{userId}/projects/{projectId}
        description: Get a specific project for a user
        inputParameters:
          - name: userId
            in: path
            type: string
            description: The user ID
          - name: projectId
            in: path
            type: string
            description: The project ID
```

Callers access it as: `GET /users/123/projects/456`
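Conceptually, the engine matches the concrete request path against the template and extracts the bracketed segments. A minimal sketch of that matching logic (illustrative only, not Naftiko's router):

```python
import re

def match_path(template: str, path: str):
    """Match a request path against a /users/{userId}-style template;
    return the extracted path parameters, or None on no match."""
    # Turn each {name} segment into a named regex group
    pattern = re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", template)
    m = re.fullmatch(pattern, path)
    return m.groupdict() if m else None

print(match_path("/users/{userId}/projects/{projectId}",
                 "/users/123/projects/456"))
# {'userId': '123', 'projectId': '456'}
```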
A: Use the `in` field in `inputParameters`:
```yaml
inputParameters:
  - name: filter
    in: query
    type: string
    description: Filter results
  - name: Authorization
    in: header
    type: string
    description: Auth header
  - name: X-Custom
    in: header
    type: string
    description: Custom header
```

Callers send `GET /endpoint?filter=value` with custom headers.
A: Use forward to pass requests through to a consumed API without transformation:
```yaml
exposes:
  - resources:
      - path: /github/{path}
        description: Pass-through proxy to GitHub API
        forward:
          targetNamespace: github
          trustedHeaders:
            - Authorization
            - Accept
```

This forwards `GET /github/repos/owner/name` to GitHub's `/repos/owner/name`.
Trusted headers must be explicitly listed for security.
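The forwarding behavior described above can be sketched as two operations: strip the exposed path prefix and keep only trusted headers. An illustrative Python sketch, where the `prefix` and `trusted` parameter names are this sketch's own, not spec fields:

```python
def forward_request(path: str, headers: dict, prefix: str, trusted: list):
    """Sketch of forward semantics: rewrite the path for the upstream
    API and drop any header not explicitly trusted."""
    upstream_path = path[len(prefix):]   # /github/repos/x -> /repos/x
    safe_headers = {k: v for k, v in headers.items() if k in trusted}
    return upstream_path, safe_headers

path, hdrs = forward_request(
    "/github/repos/owner/name",
    {"Authorization": "token xyz", "Cookie": "session=abc"},
    prefix="/github",
    trusted=["Authorization", "Accept"],
)
print(path)   # /repos/owner/name
print(hdrs)   # {'Authorization': 'token xyz'}
```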
A: Use type: mcp in exposes instead of type: api:
```yaml
exposes:
  - type: mcp
    address: localhost
    port: 9091
    namespace: my-mcp
    description: My MCP server
    tools:
      - name: query-database
        description: Query the database
        call: notion.query-db
        with:
          db_id: "fixed-db-id"
        outputParameters:
          - type: array
            mapping: $.results
```

A:
| Transport | Use Case | Setup |
|---|---|---|
| HTTP | Streamable HTTP transport, integrates with existing infrastructure | Specify `address` and `port` |
| stdio | Direct process communication, native integration with Claude Desktop | No address/port needed |
For Claude integration, stdio is typically preferred. HTTP is useful for remote or containerized deployments.
A: Add resources and prompts sections to your MCP server:
```yaml
exposes:
  - type: mcp
    resources:
      - uri: file:///docs/guide.md
        name: User Guide
        description: API usage guide
    prompts:
      - name: analyze-code
        description: Analyze code snippet
        template: "Analyze this code:\n{{code}}"
```

MCP clients can then discover and use these resources dynamically.
A:
- Validate your YAML first:

  ```
  naftiko validate capability.yaml
  ```

- Check the Docker logs:

  ```
  docker run ... ghcr.io/naftiko/framework:v0.4 /app/capability.yaml
  # Look for error messages in the output
  ```

- Verify your file path - if using Docker, ensure:
  - The volume mount is correct: `-v /absolute/path:/app/capability.yaml`
  - The file exists and is readable
  - For Docker on Windows/Mac, use proper path translation

- Check external services - ensure:
  - APIs you're consuming are reachable
  - Network connectivity is available
  - Authentication credentials are correct
A:
- Check the request format - ensure headers, parameters, and body match your definition
- Verify consumed API availability - test the underlying API directly
- Inspect JsonPath mappings - ensure your extraction paths match the API response
- Use Docker logs - see server-side error messages
A:
- Test your JsonPath - use an online tool like jsonpath.com
- Inspect the actual response - add an operation without filtering to see raw data
- Understand array syntax:
  - `$.users[0]` - first element
  - `$.users[*]` - all elements (creates array output)
  - `$.users[*].name` - all names
- For nested objects, trace the path step by step: `$.data.user.profile.email`
A:
- Check parameter names match - consumed parameter names must match keys in `with`
- Verify parameter location (`in: path`, `in: query`, `in: header`, etc.)
- Check variable references - ensure `{{variable_name}}` variables are defined in `externalRefs`
- Test without transformation - use `forward` to proxy the request and see if the underlying API works
A:
- Test credentials directly - verify your token/key works with the API
- Check token format - ensure it's a valid token (not expired, wrong format, etc.)
- Verify placement - is the token in the right header, query parameter, or body?
- Environment variables - ensure the Docker environment variable matches the key name in `externalRefs`
- Quotes - make sure tokens with special characters are properly quoted in YAML
A: We welcome all contributions! Here's how:
- Report bugs or request features - GitHub Issues
  - Search for existing issues first to avoid duplicates
- Submit code changes - GitHub Pull Requests
  - Create a local branch
  - Ensure your code passes all build validation
  - Rebase on `main` before submitting
- Contribute examples - add capability examples to the repository
  - Document your use case in the example
  - Include comments explaining key features
- Improve documentation - fix typos, clarify docs, add examples
A: Naftiko is a Java project using Maven. To build and develop:
```
# Clone the repository
git clone https://github.com/naftiko/framework.git
cd framework

# Build the project
mvn clean install

# Run tests
mvn test

# Build Docker image
docker build -t naftiko:local .
```

Key directories:

- `src/main/java/io/naftiko/` - Core engine code
- `src/main/resources/schemas/` - JSON Schema definitions
- `src/test/` - Unit and integration tests
- `src/main/resources/specs/` - Specification proposals and examples
A:
- Keep the Naftiko Specification as a first-class citizen - refer to it often
- Don't expose unused input parameters - every parameter should be used in steps
- Don't declare consumed outputs you don't use - be precise in mappings
- Don't prefix variables unnecessarily - let scope provide clarity
Example:

```yaml
# Good: expose only used input
inputParameters:
  - name: database_id    # Used in step below
    in: path

# Bad: expose unused input
inputParameters:
  - name: database_id
  - name: unused_param   # Never used anywhere

# Good: output only consumed outputs you map
outputParameters:
  - name: result
    value: $.step1.output   # Clearly mapped

# Bad: declare outputs you don't use
outputParameters:
  - name: unused_result
    value: $.step1.unused
```

A:
- Unit tests - add tests in `src/test/java`
- Integration tests - test against real or mock APIs
- Validation - use the CLI tool: `naftiko validate capability.yaml`
- Docker testing - build and run the Docker image with your capability
A: Naftiko requires Java 17 or later. This is specified in the Maven configuration.
A: Yes, use Mustache-style `{{variable}}` expressions:
```yaml
externalRefs:
  - name: env
    type: environment
    keys:
      api_key: API_KEY
      base_url: API_BASE_URL

consumes:
  - baseUri: "{{base_url}}"
    authentication:
      type: apikey
      key: X-API-Key
      value: "{{api_key}}"
```

Variables come from `externalRefs` and are injected at runtime.
A: Indirectly - by referencing the exposed URL/port as a consumed API:
```yaml
# Capability B "consumes" the exposed endpoint from Capability A
consumes:
  - baseUri: http://localhost:8081   # Capability A's port
    namespace: capability-a
```

This way, Capability B can combine Capability A with other APIs.
A: Naftiko currently doesn't have built-in retry logic in v0.4. Options:
- At the HTTP client level - use an API gateway with retry policies
- In future versions - this is on the roadmap
Check the Roadmap for planned features.
A: Yes! Add multiple entries to exposes:
```yaml
exposes:
  - type: api
    port: 8081
    namespace: rest-api
    resources: [...]
  - type: mcp
    port: 9091
    namespace: mcp-server
    tools: [...]

consumes: [...]   # Shared between both
```

Both adapters consume the same sources but expose different interfaces.
A: Naftiko is suitable for moderate to high loads depending on:
- Your consumed APIs' performance - Naftiko's overhead is minimal
- Docker/Kubernetes scaling - deploy multiple instances behind a load balancer
- Orchestration complexity - simpler capabilities (forward, single calls) are faster
For production workloads:
- Use Kubernetes for auto-scaling
- Monitor the latencies of your consumed APIs
- Consider caching strategies above Naftiko
A:
- Kubernetes (recommended):

  ```yaml
  apiVersion: apps/v1
  kind: Deployment
  metadata:
    name: naftiko-engine
  spec:
    replicas: 3
    template:
      spec:
        containers:
          - name: naftiko
            image: ghcr.io/naftiko/framework:v0.4
            volumeMounts:
              - name: capability
                mountPath: /app/capability.yaml
                subPath: capability.yaml
            env:
              - name: GITHUB_TOKEN
                valueFrom:
                  secretKeyRef:
                    name: naftiko-secrets
                    key: github-token
  ```

- Docker Compose - for simpler setups
- Environment variables - inject secrets via `externalRefs` with `resolution: runtime`
A: Yes, absolutely. Naftiko exposes standard HTTP endpoints, so it works with any reverse proxy.
Example (nginx):

```
server {
    listen 80;
    location / {
        proxy_pass http://naftiko:8081;
    }
}
```

A: Naftiko is complementary to these specifications and combines their strengths into a single runtime model:
- Consume/expose duality - like OpenAPI's interface description, but bidirectional
- Orchestration - like Arazzo's workflow sequencing
- AI-driven discovery - beyond what all three cover natively
- Namespace-based routing - unique to Naftiko's runtime approach
See the Specification for a detailed comparison.
A: Yes, v0.4 is stable as of March 2026. The specification follows semantic versioning:
- Major versions (x.0.0) - breaking changes
- Minor versions (0.x.0) - new features, backward-compatible
- Patch versions (0.0.x) - bug fixes
Set the `naftiko` field in your YAML to specify the version.
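If you need to compare versions programmatically (for example, to gate a capability on a minimum schema version), semver strings compare cleanly as integer tuples. A small illustrative helper:

```python
def parse_semver(version: str) -> tuple:
    """Split 'MAJOR.MINOR.PATCH' into a comparable tuple of ints;
    missing components default to 0 (so '0.4' reads as 0.4.0)."""
    parts = [int(p) for p in version.split(".")]
    while len(parts) < 3:
        parts.append(0)
    return tuple(parts)

# Tuples compare component-wise, matching semver precedence
assert parse_semver("0.4") < parse_semver("0.4.1") < parse_semver("0.5.0")
print(parse_semver("0.4"))  # (0, 4, 0)
```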
A: Join the community at:
- GitHub Discussions - Ask questions and share ideas
- GitHub Issues - Report bugs or request features
- Pull Requests - Review and discuss code changes
A: Yes! Several resources:
- Tutorial - Step-by-step guides
- Use Cases - Real-world examples
- Repository examples - in `src/main/resources/specs/` and test resources
- Specification examples - in the Specification (Section 4)
A: Check the Releases page for version history. The project follows a regular release cadence with security updates prioritized.
A:
- Read the Tutorial - particularly steps 2-5 on forwarding and orchestration
- Define consumed sources - GitHub and Notion APIs with auth
- Design exposed resources - endpoints that combine their data
- Use multi-step orchestration - call both APIs and map results
- Test locally - use Docker to run your capability
A:
- Use `type: mcp` in `exposes`
- Define `tools` - each tool is an MCP tool your capability provides
- Use stdio transport - for native Claude Desktop integration
- Test with Claude - configure Claude Desktop with your MCP server
- Publish - share your capability spec with the community
See the Tutorial Section 6 (MCP) for a full example.
A:
- Consume multiple SaaS APIs - define each in `consumes`
- Normalize outputs - use `outputParameters` to extract and structure data consistently
- Expose unified interface - create a single API with harmonized formats
- Use orchestration - combine data from multiple sources if needed
This is Naftiko's core strength for managing API sprawl.
- Specification - Complete technical reference
- Tutorial - Step-by-step learning guide
- Installation - Setup instructions
- Use Cases - Real-world examples
- Roadmap - Future plans
- Contribute - How to contribute
- Discussions - Community Q&A
Did this FAQ help you? Have questions not covered here?
- Add an issue - GitHub Issues
- Start a discussion - GitHub Discussions
- Submit a PR - Help us improve this FAQ!