The Models endpoint provides information about the available Swarm Inference models in the Fortytwo decentralized network. Each model represents AI capabilities running across distributed nodes.
List Models Endpoint
Retrieve a list of all available models that can be used for chat completions.
Endpoint
GET https://api.fortytwo.network/v1/models
Request
curl https://api.fortytwo.network/v1/models \
-H "Authorization: Bearer YOUR_FORTYTWO_API_KEY"
Response
The API returns a list of model objects in OpenAI-compatible format:
{
  "object": "list",
  "data": [
    {
      "id": "fortytwo-preview",
      "object": "model",
      "created": 1760704134,
      "owned_by": "fortytwo"
    }
  ]
}
Response Fields
data: List of available model objects.
object: Always "list" for this endpoint.
Model Object Fields
id: Unique identifier for the model (use this in chat completions).
owned_by: Organization that owns or operates the model.
permission: Model permissions and capabilities.
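If you are handling this payload in Python, the following is a minimal sketch of the model object using only the fields shown in the example response above. The Model class and parse_models helper are illustrative and not part of any SDK.

from dataclasses import dataclass

# Illustrative container for the model object returned by /v1/models;
# field names mirror the example response above.
@dataclass
class Model:
    id: str         # unique model identifier, e.g. "fortytwo-preview"
    object: str     # always "model"
    created: int    # Unix timestamp
    owned_by: str   # owning organization, e.g. "fortytwo"

def parse_models(payload: dict) -> list[Model]:
    """Convert the raw /v1/models JSON into Model instances,
    ignoring any extra fields the API may include."""
    fields = ("id", "object", "created", "owned_by")
    return [Model(**{k: m[k] for k in fields}) for m in payload["data"]]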
Using Models
Once you've retrieved the list of available models, you can use any model ID in your Chat Completions requests:
import requests

YOUR_FORTYTWO_API_KEY = "YOUR_FORTYTWO_API_KEY"  # replace with your API key

# List available models
response = requests.get(
    "https://api.fortytwo.network/v1/models",
    headers={"Authorization": f"Bearer {YOUR_FORTYTWO_API_KEY}"}
)
models = response.json()

# Use a specific model
model_id = models["data"][0]["id"]

# Make a chat completion request
completion = requests.post(
    "https://api.fortytwo.network/v1/chat/completions",
    headers={"Authorization": f"Bearer {YOUR_FORTYTWO_API_KEY}"},
    json={
        "model": model_id,
        "messages": [
            {"role": "user", "content": "Hello!"}
        ]
    }
)
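Because the endpoint is OpenAI-compatible, the completion response body should follow the standard choices structure. A minimal sketch for reading the assistant's reply, assuming that shape:

# Read the assistant's reply (assumes an OpenAI-compatible response body
# with a "choices" array; adjust if the actual payload differs)
data = completion.json()
print(data["choices"][0]["message"]["content"])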
OpenAI Compatibility
The Models endpoint is fully compatible with OpenAI’s models API format. You can use the same client libraries and code:
from openai import OpenAI

# Point to Fortytwo API
client = OpenAI(
    api_key="YOUR_FORTYTWO_API_KEY",
    base_url="https://api.fortytwo.network/v1"
)

# List models (same as OpenAI)
models = client.models.list()
for model in models.data:
    print(f"Model: {model.id}")
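The same client can then call Chat Completions with one of the listed models. A minimal sketch, assuming the standard OpenAI chat parameters are accepted (see the Chat Completions documentation for the full parameter set):

# Use one of the listed models for a chat completion via the same client
completion = client.chat.completions.create(
    model=models.data[0].id,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(completion.choices[0].message.content)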
Next Steps