API Documentation
Everything you need to integrate the Tektik AI API.
Quick Start
Get started with the Tektik AI API in 3 steps.
Create an API Key
Log in to your account and create a new key from the API Keys section of the Dashboard. Your key will start with tkai_.
Top Up Your Balance
Top up your balance from the Dashboard. The minimum top-up amount is 250 TL. You can pay by credit card or bank transfer (EFT).
Send Your First Request
Send your API key in the Authorization: Bearer header.
curl -X POST https://tektik.ai/api/v1/services/translate \
-H "Authorization: Bearer tkai_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"request": {
"target_language": "English",
"text": "Çevirilecek Metin"
}
}'
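The same request can be sent from any HTTP client. Below is a minimal sketch using Python's requests library; it mirrors the curl example above, and the response fields it reads (response, costTl) follow the response examples shown later in this document.

# Minimal sketch of the first request with Python's requests library.
# Replace tkai_YOUR_API_KEY with your own key.
import requests

API_KEY = "tkai_YOUR_API_KEY"
BASE_URL = "https://tektik.ai/api/v1"

resp = requests.post(
    f"{BASE_URL}/services/translate",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "openai/gpt-4o",
        "request": {
            "target_language": "English",
            "text": "Text to translate",
        },
    },
)
resp.raise_for_status()
data = resp.json()
print(data["response"])  # translated text
print(data["costTl"])    # cost of the call in TL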
Authentication
All API requests require Bearer token authentication.
Send your API key in the Authorization header with every request:
Authorization: Bearer tkai_YOUR_API_KEY
API Key Format
- Keys start with the tkai_ prefix and contain 40 hexadecimal characters.
- Your key is shown only once, when it is created. Store it in a safe place.
- Using an invalid or expired key returns a 401 AUTH_INVALID_KEY error.
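An invalid key can be detected from the HTTP status code. A minimal sketch in Python; the exact shape of the error body is not specified here, so only the status code and the documented error code are relied on.

import requests

resp = requests.get(
    "https://tektik.ai/api/v1/services",
    headers={"Authorization": "Bearer tkai_YOUR_API_KEY"},
)
if resp.status_code == 401:
    # Documented error code: AUTH_INVALID_KEY. The body format is an assumption,
    # so the raw text is printed rather than parsing specific fields.
    print("Invalid or expired API key:", resp.text)
else:
    resp.raise_for_status()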
Base URL
https://tektik.ai/api/v1/
All endpoints are under this base URL.
Services
List the AI services, view their details, and run them.
/api/v1/services
Lists all active services.
curl https://tektik.ai/api/v1/services \
-H "Authorization: Bearer tkai_YOUR_API_KEY"Yanıt Örneği
[
{
"slug": "translate",
"nameTr": "Çeviri",
"nameEn": "Translation",
"descriptionTr": "Metin çeviri servisi",
"responseType": "json",
"requestSchema": [...],
"allowedModelIds": ["openai/gpt-4o", ...]
}
]
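As a sketch, the list endpoint can be consumed from Python to discover services programmatically; the field names (slug, nameEn) follow the response example above.

import requests

headers = {"Authorization": "Bearer tkai_YOUR_API_KEY"}
services = requests.get("https://tektik.ai/api/v1/services", headers=headers).json()

# Print each service's slug and English name.
for service in services:
    print(f"{service['slug']}: {service['nameEn']}")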
/api/v1/services/:slug
Returns a service's details, available models, and prices.
curl https://tektik.ai/api/v1/services/translate \
-H "Authorization: Bearer tkai_YOUR_API_KEY"
Response Example
{
"slug": "translate",
"nameTr": "Çeviri",
"requestSchema": [...],
"allowedModelIds": ["openai/gpt-4o"],
"models": [
{
"modelId": "openai/gpt-4o",
"displayName": "GPT-4o",
"inputPriceTl": 0.0154,
"outputPriceTl": 0.0462,
"contextWindow": 128000
}
]
}
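Because the models array in the detail response carries per-token prices, a client can pick a model by cost before running a service. A sketch using the fields shown above (modelId, inputPriceTl, outputPriceTl):

import requests

headers = {"Authorization": "Bearer tkai_YOUR_API_KEY"}
detail = requests.get(
    "https://tektik.ai/api/v1/services/translate", headers=headers
).json()

# Choose the allowed model with the lowest input price.
cheapest = min(detail["models"], key=lambda m: m["inputPriceTl"])
print(cheapest["modelId"], cheapest["inputPriceTl"], cheapest["outputPriceTl"])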
/api/v1/services/translate
Translation
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | AI model to use |
| target_language | enum: English, French, Arabic, Japanese, Spanish, Russian, Chinese | Yes | Target language (e.g., English) |
| text | string | Yes | Text to translate |
Available Models
| Model | Input (1K tokens) | Output (1K tokens) |
|---|---|---|
| openai/gpt-5.3-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.3-codex | 0,1051 TL | 0,8408 TL |
| x-ai/grok-4.1-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-code-fast-1 | 0,012 TL | 0,0901 TL |
| anthropic/claude-sonnet-4.6 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4.6 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-haiku-4.5 | 0,0601 TL | 0,3003 TL |
| ai21/jamba-large-1.7 | 0,1201 TL | 0,4805 TL |
| aion-labs/aion-1.0 | 0,2402 TL | 0,4805 TL |
| aion-labs/aion-1.0-mini | 0,042 TL | 0,0841 TL |
| aion-labs/aion-2.0 | 0,048 TL | 0,0961 TL |
| aion-labs/aion-rp-llama-3.1-8b | 0,048 TL | 0,0961 TL |
| alfredpros/codellama-7b-instruct-solidity | 0,048 TL | 0,0721 TL |
| allenai/olmo-2-0325-32b-instruct | 0,003 TL | 0,012 TL |
| allenai/olmo-3-32b-think | 0,009 TL | 0,03 TL |
| allenai/olmo-3.1-32b-instruct | 0,012 TL | 0,036 TL |
| allenai/olmo-3.1-32b-think | 0,009 TL | 0,03 TL |
| amazon/nova-2-lite-v1 | 0,018 TL | 0,1502 TL |
| amazon/nova-lite-v1 | 0,0036 TL | 0,0144 TL |
| amazon/nova-micro-v1 | 0,0021 TL | 0,0084 TL |
| amazon/nova-premier-v1 | 0,1502 TL | 0,7507 TL |
| amazon/nova-pro-v1 | 0,048 TL | 0,1922 TL |
| anthropic/claude-3-haiku | 0,015 TL | 0,0751 TL |
| anthropic/claude-3.5-haiku | 0,048 TL | 0,2402 TL |
| anthropic/claude-3.5-sonnet | 0,3604 TL | 1,8018 TL |
| anthropic/claude-3.7-sonnet | 0,1802 TL | 0,9009 TL |
| anthropic/claude-3.7-sonnet:thinking | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.1 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.5 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-sonnet-4 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-sonnet-4.5 | 0,1802 TL | 0,9009 TL |
| arcee-ai/coder-large | 0,03 TL | 0,048 TL |
| arcee-ai/maestro-reasoning | 0,0541 TL | 0,1982 TL |
| arcee-ai/spotlight | 0,0108 TL | 0,0108 TL |
| arcee-ai/trinity-mini | 0,0027 TL | 0,009 TL |
| arcee-ai/virtuoso-large | 0,045 TL | 0,0721 TL |
| openrouter/auto | -60.060,00 TL | -60.060,00 TL |
| baidu/ernie-4.5-21b-a3b | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-21b-a3b-thinking | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-300b-a47b | 0,0168 TL | 0,0661 TL |
| baidu/ernie-4.5-vl-28b-a3b | 0,0084 TL | 0,0336 TL |
| baidu/ernie-4.5-vl-424b-a47b | 0,0252 TL | 0,0751 TL |
| openrouter/bodybuilder | -60.060,00 TL | -60.060,00 TL |
| bytedance-seed/seed-1.6 | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-1.6-flash | 0,0045 TL | 0,018 TL |
| bytedance-seed/seed-2.0-lite | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-2.0-mini | 0,006 TL | 0,024 TL |
| bytedance/ui-tars-1.5-7b | 0,006 TL | 0,012 TL |
| cohere/command-a | 0,1502 TL | 0,6006 TL |
| cohere/command-r-08-2024 | 0,009 TL | 0,036 TL |
| cohere/command-r-plus-08-2024 | 0,1502 TL | 0,6006 TL |
| cohere/command-r7b-12-2024 | 0,0023 TL | 0,009 TL |
| deepcogito/cogito-v2.1-671b | 0,0751 TL | 0,0751 TL |
| deepseek/deepseek-chat | 0,0192 TL | 0,0535 TL |
| deepseek/deepseek-chat-v3-0324 | 0,012 TL | 0,0462 TL |
| deepseek/deepseek-chat-v3.1 | 0,009 TL | 0,045 TL |
| deepseek/deepseek-v3.1-terminus | 0,0126 TL | 0,0474 TL |
| deepseek/deepseek-v3.2 | 0,0156 TL | 0,0228 TL |
| deepseek/deepseek-v3.2-exp | 0,0162 TL | 0,0246 TL |
| deepseek/deepseek-v3.2-speciale | 0,024 TL | 0,0721 TL |
| deepseek/deepseek-r1 | 0,042 TL | 0,1502 TL |
| deepseek/deepseek-r1-0528 | 0,027 TL | 0,1291 TL |
| deepseek/deepseek-r1-distill-llama-70b | 0,042 TL | 0,048 TL |
| deepseek/deepseek-r1-distill-qwen-32b | 0,0174 TL | 0,0174 TL |
| eleutherai/llemma_7b | 0,048 TL | 0,0721 TL |
| essentialai/rnj-1-instruct | 0,009 TL | 0,009 TL |
| alpindale/goliath-120b | 0,2252 TL | 0,4505 TL |
| google/gemini-2.0-flash-001 | 0,006 TL | 0,024 TL |
| google/gemini-2.0-flash-lite-001 | 0,0045 TL | 0,018 TL |
| google/gemini-2.5-flash | 0,018 TL | 0,1502 TL |
| google/gemini-2.5-flash-lite | 0,006 TL | 0,024 TL |
| google/gemini-2.5-flash-lite-preview-09-2025 | 0,006 TL | 0,024 TL |
| google/gemini-2.5-pro | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview-05-06 | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview | 0,0751 TL | 0,6006 TL |
| google/gemini-3-flash-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3.1-flash-lite-preview | 0,015 TL | 0,0901 TL |
| google/gemini-3.1-pro-preview | 0,1201 TL | 0,7207 TL |
| google/gemini-3.1-pro-preview-customtools | 0,1201 TL | 0,7207 TL |
| google/gemma-2-27b-it | 0,039 TL | 0,039 TL |
| google/gemma-2-9b-it | 0,0018 TL | 0,0054 TL |
| google/gemma-3-12b-it | 0,0024 TL | 0,0078 TL |
| google/gemma-3-27b-it | 0,0048 TL | 0,0096 TL |
| google/gemma-3-4b-it | 0,0024 TL | 0,0048 TL |
| google/gemma-3n-e4b-it | 0,0012 TL | 0,0024 TL |
| google/gemini-2.5-flash-image | 0,018 TL | 0,1502 TL |
| google/gemini-3.1-flash-image-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3-pro-image-preview | 0,1201 TL | 0,7207 TL |
| ibm-granite/granite-4.0-h-micro | 0,001 TL | 0,0066 TL |
| inception/mercury | 0,015 TL | 0,045 TL |
| inception/mercury-2 | 0,015 TL | 0,045 TL |
| inception/mercury-coder | 0,015 TL | 0,045 TL |
| inflection/inflection-3-pi | 0,1502 TL | 0,6006 TL |
| inflection/inflection-3-productivity | 0,1502 TL | 0,6006 TL |
| kwaipilot/kat-coder-pro | 0,0124 TL | 0,0497 TL |
| kwaipilot/kat-coder-pro-v2 | 0,018 TL | 0,0721 TL |
| liquid/lfm-2.2-6b | 0,0006 TL | 0,0012 TL |
| liquid/lfm-2-24b-a2b | 0,0018 TL | 0,0072 TL |
| liquid/lfm2-8b-a1b | 0,0006 TL | 0,0012 TL |
| meta-llama/llama-guard-3-8b | 0,0012 TL | 0,0036 TL |
| anthracite-org/magnum-v4-72b | 0,1802 TL | 0,3003 TL |
| mancer/weaver | 0,045 TL | 0,0601 TL |
| meituan/longcat-flash-chat | 0,012 TL | 0,048 TL |
| meta-llama/llama-3-70b-instruct | 0,0306 TL | 0,0444 TL |
| meta-llama/llama-3-8b-instruct | 0,0018 TL | 0,0024 TL |
| meta-llama/llama-3.1-70b-instruct | 0,024 TL | 0,024 TL |
| meta-llama/llama-3.1-8b-instruct | 0,0012 TL | 0,003 TL |
| meta-llama/llama-3.2-11b-vision-instruct | 0,0029 TL | 0,0029 TL |
| meta-llama/llama-3.2-1b-instruct | 0,0016 TL | 0,012 TL |
| meta-llama/llama-3.2-3b-instruct | 0,0031 TL | 0,0204 TL |
| meta-llama/llama-3.3-70b-instruct | 0,006 TL | 0,0192 TL |
| meta-llama/llama-4-maverick | 0,009 TL | 0,036 TL |
| meta-llama/llama-4-scout | 0,0048 TL | 0,018 TL |
| meta-llama/llama-guard-4-12b | 0,0108 TL | 0,0108 TL |
| microsoft/phi-4 | 0,0039 TL | 0,0084 TL |
| minimax/minimax-m1 | 0,024 TL | 0,1321 TL |
| minimax/minimax-m2 | 0,0153 TL | 0,0601 TL |
| minimax/minimax-m2-her | 0,018 TL | 0,0721 TL |
| minimax/minimax-m2.1 | 0,0162 TL | 0,0571 TL |
| minimax/minimax-m2.5 | 0,0114 TL | 0,0691 TL |
| minimax/minimax-m2.7 | 0,018 TL | 0,0721 TL |
| minimax/minimax-01 | 0,012 TL | 0,0661 TL |
| mistralai/mistral-large | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2407 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/codestral-2508 | 0,018 TL | 0,0541 TL |
| mistralai/devstral-2512 | 0,024 TL | 0,1201 TL |
| mistralai/devstral-medium | 0,024 TL | 0,1201 TL |
| mistralai/devstral-small | 0,006 TL | 0,018 TL |
| mistralai/ministral-14b-2512 | 0,012 TL | 0,012 TL |
| mistralai/ministral-3b-2512 | 0,006 TL | 0,006 TL |
| mistralai/ministral-8b-2512 | 0,009 TL | 0,009 TL |
| mistralai/mistral-7b-instruct-v0.1 | 0,0066 TL | 0,0114 TL |
| mistralai/mistral-large-2512 | 0,03 TL | 0,0901 TL |
| mistralai/mistral-medium-3 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-medium-3.1 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-nemo | 0,0012 TL | 0,0024 TL |
| mistralai/mistral-small-24b-instruct-2501 | 0,003 TL | 0,0048 TL |
| mistralai/mistral-small-3.1-24b-instruct | 0,0018 TL | 0,0066 TL |
| mistralai/mistral-small-3.2-24b-instruct | 0,0045 TL | 0,012 TL |
| mistralai/mistral-small-2603 | 0,009 TL | 0,036 TL |
| mistralai/mistral-small-creative | 0,006 TL | 0,018 TL |
| mistralai/mixtral-8x22b-instruct | 0,1201 TL | 0,3604 TL |
| mistralai/mixtral-8x7b-instruct | 0,0324 TL | 0,0324 TL |
| mistralai/pixtral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-saba | 0,012 TL | 0,036 TL |
| mistralai/voxtral-small-24b-2507 | 0,006 TL | 0,018 TL |
| moonshotai/kimi-k2 | 0,0342 TL | 0,1381 TL |
| moonshotai/kimi-k2-0905 | 0,024 TL | 0,1201 TL |
| moonshotai/kimi-k2-thinking | 0,0282 TL | 0,1201 TL |
| moonshotai/kimi-k2.5 | 0,0252 TL | 0,1321 TL |
| morph/morph-v3-fast | 0,048 TL | 0,0721 TL |
| morph/morph-v3-large | 0,0541 TL | 0,1141 TL |
| gryphe/mythomax-l2-13b | 0,0036 TL | 0,0036 TL |
| nex-agi/deepseek-v3.1-nex-n1 | 0,0081 TL | 0,03 TL |
| nousresearch/hermes-3-llama-3.1-405b | 0,0601 TL | 0,0601 TL |
| nousresearch/hermes-3-llama-3.1-70b | 0,018 TL | 0,018 TL |
| nousresearch/hermes-4-405b | 0,0601 TL | 0,1802 TL |
| nousresearch/hermes-4-70b | 0,0078 TL | 0,024 TL |
| nousresearch/hermes-2-pro-llama-3-8b | 0,0084 TL | 0,0084 TL |
| nvidia/llama-3.1-nemotron-70b-instruct | 0,0721 TL | 0,0721 TL |
| nvidia/llama-3.1-nemotron-ultra-253b-v1 | 0,036 TL | 0,1081 TL |
| nvidia/llama-3.3-nemotron-super-49b-v1.5 | 0,006 TL | 0,024 TL |
| nvidia/nemotron-3-nano-30b-a3b | 0,003 TL | 0,012 TL |
| nvidia/nemotron-3-super-120b-a12b | 0,006 TL | 0,03 TL |
| nvidia/nemotron-nano-12b-v2-vl | 0,012 TL | 0,036 TL |
| nvidia/nemotron-nano-9b-v2 | 0,0024 TL | 0,0096 TL |
| openai/gpt-audio | 0,1502 TL | 0,6006 TL |
| openai/gpt-audio-mini | 0,036 TL | 0,1441 TL |
| openai/gpt-3.5-turbo | 0,03 TL | 0,0901 TL |
| openai/gpt-3.5-turbo-0613 | 0,0601 TL | 0,1201 TL |
| openai/gpt-3.5-turbo-16k | 0,1802 TL | 0,2402 TL |
| openai/gpt-3.5-turbo-instruct | 0,0901 TL | 0,1201 TL |
| openai/gpt-4 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-0314 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-turbo | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-1106-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-turbo-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4.1 | 0,1201 TL | 0,4805 TL |
| openai/gpt-4.1-mini | 0,024 TL | 0,0961 TL |
| openai/gpt-4.1-nano | 0,006 TL | 0,024 TL |
| openai/gpt-4o | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-05-13 | 0,3003 TL | 0,9009 TL |
| openai/gpt-4o-2024-08-06 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-11-20 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o:extended | 0,3604 TL | 1,0811 TL |
| openai/gpt-4o-audio-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-search-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-mini | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-2024-07-18 | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-search-preview | 0,009 TL | 0,036 TL |
| openai/gpt-5 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-image | 0,6006 TL | 0,6006 TL |
| openai/gpt-5-image-mini | 0,1502 TL | 0,1201 TL |
| openai/gpt-5-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5-nano | 0,003 TL | 0,024 TL |
| openai/gpt-5-pro | 0,9009 TL | 7,2072 TL |
| openai/gpt-5.1 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-max | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5.2 | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-pro | 1,2613 TL | 10,0901 TL |
| openai/gpt-5.2-codex | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.4 | 0,1502 TL | 0,9009 TL |
| openai/gpt-5.4-mini | 0,045 TL | 0,2703 TL |
| openai/gpt-5.4-nano | 0,012 TL | 0,0751 TL |
| openai/gpt-5.4-pro | 1,8018 TL | 10,8108 TL |
| openai/gpt-oss-120b | 0,0023 TL | 0,0114 TL |
| openai/gpt-oss-20b | 0,0018 TL | 0,0066 TL |
| openai/gpt-oss-safeguard-20b | 0,0045 TL | 0,018 TL |
| openai/o1 | 0,9009 TL | 3,6036 TL |
| openai/o1-pro | 9,009 TL | 36,036 TL |
| openai/o3 | 0,1201 TL | 0,4805 TL |
| openai/o3-deep-research | 0,6006 TL | 2,4024 TL |
| openai/o3-mini | 0,0661 TL | 0,2643 TL |
| openai/o3-mini-high | 0,0661 TL | 0,2643 TL |
| openai/o3-pro | 1,2012 TL | 4,8048 TL |
| openai/o4-mini | 0,0661 TL | 0,2643 TL |
| openai/o4-mini-deep-research | 0,1201 TL | 0,4805 TL |
| openai/o4-mini-high | 0,0661 TL | 0,2643 TL |
| perplexity/sonar | 0,0601 TL | 0,0601 TL |
| perplexity/sonar-deep-research | 0,1201 TL | 0,4805 TL |
| perplexity/sonar-pro | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-pro-search | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-reasoning-pro | 0,1201 TL | 0,4805 TL |
| prime-intellect/intellect-3 | 0,012 TL | 0,0661 TL |
| qwen/qwen-plus-2025-07-28 | 0,0156 TL | 0,0468 TL |
| qwen/qwen-plus-2025-07-28:thinking | 0,0156 TL | 0,0468 TL |
| qwen/qwen-vl-max | 0,0312 TL | 0,1249 TL |
| qwen/qwen-vl-plus | 0,0082 TL | 0,0246 TL |
| qwen/qwen-max | 0,0625 TL | 0,2498 TL |
| qwen/qwen-plus | 0,0156 TL | 0,0468 TL |
| qwen/qwen-turbo | 0,002 TL | 0,0078 TL |
| qwen/qwen-2.5-7b-instruct | 0,0024 TL | 0,006 TL |
| qwen/qwen2.5-coder-7b-instruct | 0,0018 TL | 0,0054 TL |
| qwen/qwen2.5-vl-32b-instruct | 0,012 TL | 0,036 TL |
| qwen/qwen2.5-vl-72b-instruct | 0,048 TL | 0,048 TL |
| qwen/qwen3-14b | 0,0036 TL | 0,0144 TL |
| qwen/qwen3-235b-a22b | 0,0273 TL | 0,1093 TL |
| qwen/qwen3-235b-a22b-2507 | 0,0043 TL | 0,006 TL |
| qwen/qwen3-235b-a22b-thinking-2507 | 0,009 TL | 0,0898 TL |
| qwen/qwen3-30b-a3b | 0,0048 TL | 0,0168 TL |
| qwen/qwen3-30b-a3b-instruct-2507 | 0,0054 TL | 0,018 TL |
| qwen/qwen3-30b-a3b-thinking-2507 | 0,0048 TL | 0,024 TL |
| qwen/qwen3-32b | 0,0048 TL | 0,0144 TL |
| qwen/qwen3-8b | 0,003 TL | 0,024 TL |
| qwen/qwen3-coder-30b-a3b-instruct | 0,0042 TL | 0,0162 TL |
| qwen/qwen3-coder | 0,0132 TL | 0,0601 TL |
| qwen/qwen3-coder-flash | 0,0117 TL | 0,0586 TL |
| qwen/qwen3-coder-next | 0,0072 TL | 0,045 TL |
| qwen/qwen3-coder-plus | 0,039 TL | 0,1952 TL |
| qwen/qwen3-max | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-max-thinking | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-next-80b-a3b-instruct | 0,0054 TL | 0,0661 TL |
| qwen/qwen3-next-80b-a3b-thinking | 0,0059 TL | 0,0468 TL |
| qwen/qwen3-vl-235b-a22b-instruct | 0,012 TL | 0,0529 TL |
| qwen/qwen3-vl-235b-a22b-thinking | 0,0156 TL | 0,1562 TL |
| qwen/qwen3-vl-30b-a3b-instruct | 0,0078 TL | 0,0312 TL |
| qwen/qwen3-vl-30b-a3b-thinking | 0,0078 TL | 0,0937 TL |
| qwen/qwen3-vl-32b-instruct | 0,0062 TL | 0,025 TL |
| qwen/qwen3-vl-8b-instruct | 0,0048 TL | 0,03 TL |
| qwen/qwen3-vl-8b-thinking | 0,007 TL | 0,082 TL |
| qwen/qwen3.5-397b-a17b | 0,0234 TL | 0,1405 TL |
| qwen/qwen3.5-plus-02-15 | 0,0156 TL | 0,0937 TL |
| qwen/qwen3.5-122b-a10b | 0,0156 TL | 0,1249 TL |
| qwen/qwen3.5-27b | 0,0117 TL | 0,0937 TL |
| qwen/qwen3.5-35b-a3b | 0,0098 TL | 0,0781 TL |
| qwen/qwen3.5-9b | 0,003 TL | 0,009 TL |
| qwen/qwen3.5-flash-02-23 | 0,0039 TL | 0,0156 TL |
| qwen/qwq-32b | 0,009 TL | 0,0348 TL |
| qwen/qwen-2.5-72b-instruct | 0,0072 TL | 0,0234 TL |
| qwen/qwen-2.5-coder-32b-instruct | 0,0396 TL | 0,0601 TL |
| reka/reka-edge | 0,006 TL | 0,006 TL |
| relace/relace-apply-3 | 0,0511 TL | 0,0751 TL |
| relace/relace-search | 0,0601 TL | 0,1802 TL |
| undi95/remm-slerp-l2-13b | 0,027 TL | 0,039 TL |
| sao10k/l3-lunaris-8b | 0,0024 TL | 0,003 TL |
| sao10k/l3-euryale-70b | 0,0889 TL | 0,0889 TL |
| sao10k/l3.1-70b-hanami-x1 | 0,1802 TL | 0,1802 TL |
| sao10k/l3.1-euryale-70b | 0,0511 TL | 0,0511 TL |
| sao10k/l3.3-euryale-70b | 0,039 TL | 0,045 TL |
| stepfun/step-3.5-flash | 0,006 TL | 0,018 TL |
| switchpoint/router | 0,0511 TL | 0,2042 TL |
| tencent/hunyuan-a13b-instruct | 0,0084 TL | 0,0342 TL |
| thedrummer/cydonia-24b-v4.1 | 0,018 TL | 0,03 TL |
| thedrummer/rocinante-12b | 0,0102 TL | 0,0258 TL |
| thedrummer/skyfall-36b-v2 | 0,033 TL | 0,048 TL |
| thedrummer/unslopnemo-12b | 0,024 TL | 0,024 TL |
| tngtech/deepseek-r1t2-chimera | 0,018 TL | 0,0661 TL |
| alibaba/tongyi-deepresearch-30b-a3b | 0,0054 TL | 0,027 TL |
| upstage/solar-pro-3 | 0,009 TL | 0,036 TL |
| microsoft/wizardlm-2-8x22b | 0,0372 TL | 0,0372 TL |
| writer/palmyra-x5 | 0,036 TL | 0,3604 TL |
| x-ai/grok-3 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-beta | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-mini | 0,018 TL | 0,03 TL |
| x-ai/grok-3-mini-beta | 0,018 TL | 0,03 TL |
| x-ai/grok-4 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-4-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-4.20-beta | 0,1201 TL | 0,3604 TL |
| x-ai/grok-4.20-multi-agent-beta | 0,1201 TL | 0,3604 TL |
| xiaomi/mimo-v2-flash | 0,0054 TL | 0,0174 TL |
| xiaomi/mimo-v2-omni | 0,024 TL | 0,1201 TL |
| xiaomi/mimo-v2-pro | 0,0601 TL | 0,1802 TL |
| z-ai/glm-4-32b | 0,006 TL | 0,006 TL |
| z-ai/glm-4.5 | 0,036 TL | 0,1321 TL |
| z-ai/glm-4.5-air | 0,0078 TL | 0,0511 TL |
| z-ai/glm-4.5v | 0,036 TL | 0,1081 TL |
| z-ai/glm-4.6 | 0,0234 TL | 0,1141 TL |
| z-ai/glm-4.6v | 0,018 TL | 0,0541 TL |
| z-ai/glm-4.7 | 0,0234 TL | 0,1051 TL |
| z-ai/glm-4.7-flash | 0,0036 TL | 0,024 TL |
| z-ai/glm-5 | 0,0432 TL | 0,1381 TL |
| z-ai/glm-5-turbo | 0,0721 TL | 0,2402 TL |
| anthropic/claude-opus-4.6-fast | 1,8018 TL | 9,009 TL |
| anthropic/claude-opus-4.7 | 0,3003 TL | 1,5015 TL |
| arcee-ai/trinity-large-thinking | 0,0132 TL | 0,0511 TL |
| google/gemma-4-26b-a4b-it | 0,0042 TL | 0,021 TL |
| google/gemma-4-31b-it | 0,0078 TL | 0,0228 TL |
| moonshotai/kimi-k2.6 | 0,036 TL | 0,1682 TL |
| qwen/qwen3.6-plus | 0,0195 TL | 0,1171 TL |
| rekaai/reka-edge | 0,006 TL | 0,006 TL |
| rekaai/reka-flash-3 | 0,006 TL | 0,012 TL |
| x-ai/grok-4.20 | 0,1201 TL | 0,3604 TL |
| x-ai/grok-4.20-multi-agent | 0,1201 TL | 0,3604 TL |
| z-ai/glm-5.1 | 0,0419 TL | 0,2643 TL |
| z-ai/glm-5v-turbo | 0,0721 TL | 0,2402 TL |
curl -X POST https://tektik.ai/api/v1/services/translate \
-H "Authorization: Bearer tkai_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"request": {
"target_language": "English",
"text": "Çevirilecek Metin"
}
}'
Response Example
{
"response": "...",
"costTl": 0.0234
}
/api/v1/services/seo-meta
SEO Meta Generator
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | AI model to use |
| title | string | Yes | Content title |
| content | string | Yes | Content text (first 3000 characters) |
| locale | enum: tr, en, fr, ar, zh, es, ru | No | Target language (default: tr) |
| keywords | string | No | Existing keywords (if any) |
Available Models
| Model | Input (1K tokens) | Output (1K tokens) |
|---|---|---|
| openai/gpt-5.3-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.3-codex | 0,1051 TL | 0,8408 TL |
| x-ai/grok-4.1-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-code-fast-1 | 0,012 TL | 0,0901 TL |
| anthropic/claude-sonnet-4.6 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4.6 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-haiku-4.5 | 0,0601 TL | 0,3003 TL |
| ai21/jamba-large-1.7 | 0,1201 TL | 0,4805 TL |
| aion-labs/aion-1.0 | 0,2402 TL | 0,4805 TL |
| aion-labs/aion-1.0-mini | 0,042 TL | 0,0841 TL |
| aion-labs/aion-2.0 | 0,048 TL | 0,0961 TL |
| aion-labs/aion-rp-llama-3.1-8b | 0,048 TL | 0,0961 TL |
| alfredpros/codellama-7b-instruct-solidity | 0,048 TL | 0,0721 TL |
| allenai/olmo-2-0325-32b-instruct | 0,003 TL | 0,012 TL |
| allenai/olmo-3-32b-think | 0,009 TL | 0,03 TL |
| allenai/olmo-3.1-32b-instruct | 0,012 TL | 0,036 TL |
| allenai/olmo-3.1-32b-think | 0,009 TL | 0,03 TL |
| amazon/nova-2-lite-v1 | 0,018 TL | 0,1502 TL |
| amazon/nova-lite-v1 | 0,0036 TL | 0,0144 TL |
| amazon/nova-micro-v1 | 0,0021 TL | 0,0084 TL |
| amazon/nova-premier-v1 | 0,1502 TL | 0,7507 TL |
| amazon/nova-pro-v1 | 0,048 TL | 0,1922 TL |
| anthropic/claude-3-haiku | 0,015 TL | 0,0751 TL |
| anthropic/claude-3.5-haiku | 0,048 TL | 0,2402 TL |
| anthropic/claude-3.5-sonnet | 0,3604 TL | 1,8018 TL |
| anthropic/claude-3.7-sonnet | 0,1802 TL | 0,9009 TL |
| anthropic/claude-3.7-sonnet:thinking | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.1 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.5 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-sonnet-4 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-sonnet-4.5 | 0,1802 TL | 0,9009 TL |
| arcee-ai/coder-large | 0,03 TL | 0,048 TL |
| arcee-ai/maestro-reasoning | 0,0541 TL | 0,1982 TL |
| arcee-ai/spotlight | 0,0108 TL | 0,0108 TL |
| arcee-ai/trinity-mini | 0,0027 TL | 0,009 TL |
| arcee-ai/virtuoso-large | 0,045 TL | 0,0721 TL |
| openrouter/auto | -60.060,00 TL | -60.060,00 TL |
| baidu/ernie-4.5-21b-a3b | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-21b-a3b-thinking | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-300b-a47b | 0,0168 TL | 0,0661 TL |
| baidu/ernie-4.5-vl-28b-a3b | 0,0084 TL | 0,0336 TL |
| baidu/ernie-4.5-vl-424b-a47b | 0,0252 TL | 0,0751 TL |
| openrouter/bodybuilder | -60.060,00 TL | -60.060,00 TL |
| bytedance-seed/seed-1.6 | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-1.6-flash | 0,0045 TL | 0,018 TL |
| bytedance-seed/seed-2.0-lite | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-2.0-mini | 0,006 TL | 0,024 TL |
| bytedance/ui-tars-1.5-7b | 0,006 TL | 0,012 TL |
| cohere/command-a | 0,1502 TL | 0,6006 TL |
| cohere/command-r-08-2024 | 0,009 TL | 0,036 TL |
| cohere/command-r-plus-08-2024 | 0,1502 TL | 0,6006 TL |
| cohere/command-r7b-12-2024 | 0,0023 TL | 0,009 TL |
| deepcogito/cogito-v2.1-671b | 0,0751 TL | 0,0751 TL |
| deepseek/deepseek-chat | 0,0192 TL | 0,0535 TL |
| deepseek/deepseek-chat-v3-0324 | 0,012 TL | 0,0462 TL |
| deepseek/deepseek-chat-v3.1 | 0,009 TL | 0,045 TL |
| deepseek/deepseek-v3.1-terminus | 0,0126 TL | 0,0474 TL |
| deepseek/deepseek-v3.2 | 0,0156 TL | 0,0228 TL |
| deepseek/deepseek-v3.2-exp | 0,0162 TL | 0,0246 TL |
| deepseek/deepseek-v3.2-speciale | 0,024 TL | 0,0721 TL |
| deepseek/deepseek-r1 | 0,042 TL | 0,1502 TL |
| deepseek/deepseek-r1-0528 | 0,027 TL | 0,1291 TL |
| deepseek/deepseek-r1-distill-llama-70b | 0,042 TL | 0,048 TL |
| deepseek/deepseek-r1-distill-qwen-32b | 0,0174 TL | 0,0174 TL |
| eleutherai/llemma_7b | 0,048 TL | 0,0721 TL |
| essentialai/rnj-1-instruct | 0,009 TL | 0,009 TL |
| alpindale/goliath-120b | 0,2252 TL | 0,4505 TL |
| google/gemini-2.0-flash-001 | 0,006 TL | 0,024 TL |
| google/gemini-2.0-flash-lite-001 | 0,0045 TL | 0,018 TL |
| google/gemini-2.5-flash | 0,018 TL | 0,1502 TL |
| google/gemini-2.5-flash-lite | 0,006 TL | 0,024 TL |
| google/gemini-2.5-flash-lite-preview-09-2025 | 0,006 TL | 0,024 TL |
| google/gemini-2.5-pro | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview-05-06 | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview | 0,0751 TL | 0,6006 TL |
| google/gemini-3-flash-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3.1-flash-lite-preview | 0,015 TL | 0,0901 TL |
| google/gemini-3.1-pro-preview | 0,1201 TL | 0,7207 TL |
| google/gemini-3.1-pro-preview-customtools | 0,1201 TL | 0,7207 TL |
| google/gemma-2-27b-it | 0,039 TL | 0,039 TL |
| google/gemma-2-9b-it | 0,0018 TL | 0,0054 TL |
| google/gemma-3-12b-it | 0,0024 TL | 0,0078 TL |
| google/gemma-3-27b-it | 0,0048 TL | 0,0096 TL |
| google/gemma-3-4b-it | 0,0024 TL | 0,0048 TL |
| google/gemma-3n-e4b-it | 0,0012 TL | 0,0024 TL |
| google/gemini-2.5-flash-image | 0,018 TL | 0,1502 TL |
| google/gemini-3.1-flash-image-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3-pro-image-preview | 0,1201 TL | 0,7207 TL |
| ibm-granite/granite-4.0-h-micro | 0,001 TL | 0,0066 TL |
| inception/mercury | 0,015 TL | 0,045 TL |
| inception/mercury-2 | 0,015 TL | 0,045 TL |
| inception/mercury-coder | 0,015 TL | 0,045 TL |
| inflection/inflection-3-pi | 0,1502 TL | 0,6006 TL |
| inflection/inflection-3-productivity | 0,1502 TL | 0,6006 TL |
| kwaipilot/kat-coder-pro | 0,0124 TL | 0,0497 TL |
| kwaipilot/kat-coder-pro-v2 | 0,018 TL | 0,0721 TL |
| liquid/lfm-2.2-6b | 0,0006 TL | 0,0012 TL |
| liquid/lfm-2-24b-a2b | 0,0018 TL | 0,0072 TL |
| liquid/lfm2-8b-a1b | 0,0006 TL | 0,0012 TL |
| meta-llama/llama-guard-3-8b | 0,0012 TL | 0,0036 TL |
| anthracite-org/magnum-v4-72b | 0,1802 TL | 0,3003 TL |
| mancer/weaver | 0,045 TL | 0,0601 TL |
| meituan/longcat-flash-chat | 0,012 TL | 0,048 TL |
| meta-llama/llama-3-70b-instruct | 0,0306 TL | 0,0444 TL |
| meta-llama/llama-3-8b-instruct | 0,0018 TL | 0,0024 TL |
| meta-llama/llama-3.1-70b-instruct | 0,024 TL | 0,024 TL |
| meta-llama/llama-3.1-8b-instruct | 0,0012 TL | 0,003 TL |
| meta-llama/llama-3.2-11b-vision-instruct | 0,0029 TL | 0,0029 TL |
| meta-llama/llama-3.2-1b-instruct | 0,0016 TL | 0,012 TL |
| meta-llama/llama-3.2-3b-instruct | 0,0031 TL | 0,0204 TL |
| meta-llama/llama-3.3-70b-instruct | 0,006 TL | 0,0192 TL |
| meta-llama/llama-4-maverick | 0,009 TL | 0,036 TL |
| meta-llama/llama-4-scout | 0,0048 TL | 0,018 TL |
| meta-llama/llama-guard-4-12b | 0,0108 TL | 0,0108 TL |
| microsoft/phi-4 | 0,0039 TL | 0,0084 TL |
| minimax/minimax-m1 | 0,024 TL | 0,1321 TL |
| minimax/minimax-m2 | 0,0153 TL | 0,0601 TL |
| minimax/minimax-m2-her | 0,018 TL | 0,0721 TL |
| minimax/minimax-m2.1 | 0,0162 TL | 0,0571 TL |
| minimax/minimax-m2.5 | 0,0114 TL | 0,0691 TL |
| minimax/minimax-m2.7 | 0,018 TL | 0,0721 TL |
| minimax/minimax-01 | 0,012 TL | 0,0661 TL |
| mistralai/mistral-large | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2407 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/codestral-2508 | 0,018 TL | 0,0541 TL |
| mistralai/devstral-2512 | 0,024 TL | 0,1201 TL |
| mistralai/devstral-medium | 0,024 TL | 0,1201 TL |
| mistralai/devstral-small | 0,006 TL | 0,018 TL |
| mistralai/ministral-14b-2512 | 0,012 TL | 0,012 TL |
| mistralai/ministral-3b-2512 | 0,006 TL | 0,006 TL |
| mistralai/ministral-8b-2512 | 0,009 TL | 0,009 TL |
| mistralai/mistral-7b-instruct-v0.1 | 0,0066 TL | 0,0114 TL |
| mistralai/mistral-large-2512 | 0,03 TL | 0,0901 TL |
| mistralai/mistral-medium-3 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-medium-3.1 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-nemo | 0,0012 TL | 0,0024 TL |
| mistralai/mistral-small-24b-instruct-2501 | 0,003 TL | 0,0048 TL |
| mistralai/mistral-small-3.1-24b-instruct | 0,0018 TL | 0,0066 TL |
| mistralai/mistral-small-3.2-24b-instruct | 0,0045 TL | 0,012 TL |
| mistralai/mistral-small-2603 | 0,009 TL | 0,036 TL |
| mistralai/mistral-small-creative | 0,006 TL | 0,018 TL |
| mistralai/mixtral-8x22b-instruct | 0,1201 TL | 0,3604 TL |
| mistralai/mixtral-8x7b-instruct | 0,0324 TL | 0,0324 TL |
| mistralai/pixtral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-saba | 0,012 TL | 0,036 TL |
| mistralai/voxtral-small-24b-2507 | 0,006 TL | 0,018 TL |
| moonshotai/kimi-k2 | 0,0342 TL | 0,1381 TL |
| moonshotai/kimi-k2-0905 | 0,024 TL | 0,1201 TL |
| moonshotai/kimi-k2-thinking | 0,0282 TL | 0,1201 TL |
| moonshotai/kimi-k2.5 | 0,0252 TL | 0,1321 TL |
| morph/morph-v3-fast | 0,048 TL | 0,0721 TL |
| morph/morph-v3-large | 0,0541 TL | 0,1141 TL |
| gryphe/mythomax-l2-13b | 0,0036 TL | 0,0036 TL |
| nex-agi/deepseek-v3.1-nex-n1 | 0,0081 TL | 0,03 TL |
| nousresearch/hermes-3-llama-3.1-405b | 0,0601 TL | 0,0601 TL |
| nousresearch/hermes-3-llama-3.1-70b | 0,018 TL | 0,018 TL |
| nousresearch/hermes-4-405b | 0,0601 TL | 0,1802 TL |
| nousresearch/hermes-4-70b | 0,0078 TL | 0,024 TL |
| nousresearch/hermes-2-pro-llama-3-8b | 0,0084 TL | 0,0084 TL |
| nvidia/llama-3.1-nemotron-70b-instruct | 0,0721 TL | 0,0721 TL |
| nvidia/llama-3.1-nemotron-ultra-253b-v1 | 0,036 TL | 0,1081 TL |
| nvidia/llama-3.3-nemotron-super-49b-v1.5 | 0,006 TL | 0,024 TL |
| nvidia/nemotron-3-nano-30b-a3b | 0,003 TL | 0,012 TL |
| nvidia/nemotron-3-super-120b-a12b | 0,006 TL | 0,03 TL |
| nvidia/nemotron-nano-12b-v2-vl | 0,012 TL | 0,036 TL |
| nvidia/nemotron-nano-9b-v2 | 0,0024 TL | 0,0096 TL |
| openai/gpt-audio | 0,1502 TL | 0,6006 TL |
| openai/gpt-audio-mini | 0,036 TL | 0,1441 TL |
| openai/gpt-3.5-turbo | 0,03 TL | 0,0901 TL |
| openai/gpt-3.5-turbo-0613 | 0,0601 TL | 0,1201 TL |
| openai/gpt-3.5-turbo-16k | 0,1802 TL | 0,2402 TL |
| openai/gpt-3.5-turbo-instruct | 0,0901 TL | 0,1201 TL |
| openai/gpt-4 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-0314 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-turbo | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-1106-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-turbo-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4.1 | 0,1201 TL | 0,4805 TL |
| openai/gpt-4.1-mini | 0,024 TL | 0,0961 TL |
| openai/gpt-4.1-nano | 0,006 TL | 0,024 TL |
| openai/gpt-4o | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-05-13 | 0,3003 TL | 0,9009 TL |
| openai/gpt-4o-2024-08-06 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-11-20 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o:extended | 0,3604 TL | 1,0811 TL |
| openai/gpt-4o-audio-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-search-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-mini | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-2024-07-18 | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-search-preview | 0,009 TL | 0,036 TL |
| openai/gpt-5 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-image | 0,6006 TL | 0,6006 TL |
| openai/gpt-5-image-mini | 0,1502 TL | 0,1201 TL |
| openai/gpt-5-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5-nano | 0,003 TL | 0,024 TL |
| openai/gpt-5-pro | 0,9009 TL | 7,2072 TL |
| openai/gpt-5.1 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-max | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5.2 | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-pro | 1,2613 TL | 10,0901 TL |
| openai/gpt-5.2-codex | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.4 | 0,1502 TL | 0,9009 TL |
| openai/gpt-5.4-mini | 0,045 TL | 0,2703 TL |
| openai/gpt-5.4-nano | 0,012 TL | 0,0751 TL |
| openai/gpt-5.4-pro | 1,8018 TL | 10,8108 TL |
| openai/gpt-oss-120b | 0,0023 TL | 0,0114 TL |
| openai/gpt-oss-20b | 0,0018 TL | 0,0066 TL |
| openai/gpt-oss-safeguard-20b | 0,0045 TL | 0,018 TL |
| openai/o1 | 0,9009 TL | 3,6036 TL |
| openai/o1-pro | 9,009 TL | 36,036 TL |
| openai/o3 | 0,1201 TL | 0,4805 TL |
| openai/o3-deep-research | 0,6006 TL | 2,4024 TL |
| openai/o3-mini | 0,0661 TL | 0,2643 TL |
| openai/o3-mini-high | 0,0661 TL | 0,2643 TL |
| openai/o3-pro | 1,2012 TL | 4,8048 TL |
| openai/o4-mini | 0,0661 TL | 0,2643 TL |
| openai/o4-mini-deep-research | 0,1201 TL | 0,4805 TL |
| openai/o4-mini-high | 0,0661 TL | 0,2643 TL |
| perplexity/sonar | 0,0601 TL | 0,0601 TL |
| perplexity/sonar-deep-research | 0,1201 TL | 0,4805 TL |
| perplexity/sonar-pro | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-pro-search | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-reasoning-pro | 0,1201 TL | 0,4805 TL |
| prime-intellect/intellect-3 | 0,012 TL | 0,0661 TL |
| qwen/qwen-plus-2025-07-28 | 0,0156 TL | 0,0468 TL |
| qwen/qwen-plus-2025-07-28:thinking | 0,0156 TL | 0,0468 TL |
| qwen/qwen-vl-max | 0,0312 TL | 0,1249 TL |
| qwen/qwen-vl-plus | 0,0082 TL | 0,0246 TL |
| qwen/qwen-max | 0,0625 TL | 0,2498 TL |
| qwen/qwen-plus | 0,0156 TL | 0,0468 TL |
| qwen/qwen-turbo | 0,002 TL | 0,0078 TL |
| qwen/qwen-2.5-7b-instruct | 0,0024 TL | 0,006 TL |
| qwen/qwen2.5-coder-7b-instruct | 0,0018 TL | 0,0054 TL |
| qwen/qwen2.5-vl-32b-instruct | 0,012 TL | 0,036 TL |
| qwen/qwen2.5-vl-72b-instruct | 0,048 TL | 0,048 TL |
| qwen/qwen3-14b | 0,0036 TL | 0,0144 TL |
| qwen/qwen3-235b-a22b | 0,0273 TL | 0,1093 TL |
| qwen/qwen3-235b-a22b-2507 | 0,0043 TL | 0,006 TL |
| qwen/qwen3-235b-a22b-thinking-2507 | 0,009 TL | 0,0898 TL |
| qwen/qwen3-30b-a3b | 0,0048 TL | 0,0168 TL |
| qwen/qwen3-30b-a3b-instruct-2507 | 0,0054 TL | 0,018 TL |
| qwen/qwen3-30b-a3b-thinking-2507 | 0,0048 TL | 0,024 TL |
| qwen/qwen3-32b | 0,0048 TL | 0,0144 TL |
| qwen/qwen3-8b | 0,003 TL | 0,024 TL |
| qwen/qwen3-coder-30b-a3b-instruct | 0,0042 TL | 0,0162 TL |
| qwen/qwen3-coder | 0,0132 TL | 0,0601 TL |
| qwen/qwen3-coder-flash | 0,0117 TL | 0,0586 TL |
| qwen/qwen3-coder-next | 0,0072 TL | 0,045 TL |
| qwen/qwen3-coder-plus | 0,039 TL | 0,1952 TL |
| qwen/qwen3-max | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-max-thinking | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-next-80b-a3b-instruct | 0,0054 TL | 0,0661 TL |
| qwen/qwen3-next-80b-a3b-thinking | 0,0059 TL | 0,0468 TL |
| qwen/qwen3-vl-235b-a22b-instruct | 0,012 TL | 0,0529 TL |
| qwen/qwen3-vl-235b-a22b-thinking | 0,0156 TL | 0,1562 TL |
| qwen/qwen3-vl-30b-a3b-instruct | 0,0078 TL | 0,0312 TL |
| qwen/qwen3-vl-30b-a3b-thinking | 0,0078 TL | 0,0937 TL |
| qwen/qwen3-vl-32b-instruct | 0,0062 TL | 0,025 TL |
| qwen/qwen3-vl-8b-instruct | 0,0048 TL | 0,03 TL |
| qwen/qwen3-vl-8b-thinking | 0,007 TL | 0,082 TL |
| qwen/qwen3.5-397b-a17b | 0,0234 TL | 0,1405 TL |
| qwen/qwen3.5-plus-02-15 | 0,0156 TL | 0,0937 TL |
| qwen/qwen3.5-122b-a10b | 0,0156 TL | 0,1249 TL |
| qwen/qwen3.5-27b | 0,0117 TL | 0,0937 TL |
| qwen/qwen3.5-35b-a3b | 0,0098 TL | 0,0781 TL |
| qwen/qwen3.5-9b | 0,003 TL | 0,009 TL |
| qwen/qwen3.5-flash-02-23 | 0,0039 TL | 0,0156 TL |
| qwen/qwq-32b | 0,009 TL | 0,0348 TL |
| qwen/qwen-2.5-72b-instruct | 0,0072 TL | 0,0234 TL |
| qwen/qwen-2.5-coder-32b-instruct | 0,0396 TL | 0,0601 TL |
| reka/reka-edge | 0,006 TL | 0,006 TL |
| relace/relace-apply-3 | 0,0511 TL | 0,0751 TL |
| relace/relace-search | 0,0601 TL | 0,1802 TL |
| undi95/remm-slerp-l2-13b | 0,027 TL | 0,039 TL |
| sao10k/l3-lunaris-8b | 0,0024 TL | 0,003 TL |
| sao10k/l3-euryale-70b | 0,0889 TL | 0,0889 TL |
| sao10k/l3.1-70b-hanami-x1 | 0,1802 TL | 0,1802 TL |
| sao10k/l3.1-euryale-70b | 0,0511 TL | 0,0511 TL |
| sao10k/l3.3-euryale-70b | 0,039 TL | 0,045 TL |
| stepfun/step-3.5-flash | 0,006 TL | 0,018 TL |
| switchpoint/router | 0,0511 TL | 0,2042 TL |
| tencent/hunyuan-a13b-instruct | 0,0084 TL | 0,0342 TL |
| thedrummer/cydonia-24b-v4.1 | 0,018 TL | 0,03 TL |
| thedrummer/rocinante-12b | 0,0102 TL | 0,0258 TL |
| thedrummer/skyfall-36b-v2 | 0,033 TL | 0,048 TL |
| thedrummer/unslopnemo-12b | 0,024 TL | 0,024 TL |
| tngtech/deepseek-r1t2-chimera | 0,018 TL | 0,0661 TL |
| alibaba/tongyi-deepresearch-30b-a3b | 0,0054 TL | 0,027 TL |
| upstage/solar-pro-3 | 0,009 TL | 0,036 TL |
| microsoft/wizardlm-2-8x22b | 0,0372 TL | 0,0372 TL |
| writer/palmyra-x5 | 0,036 TL | 0,3604 TL |
| x-ai/grok-3 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-beta | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-mini | 0,018 TL | 0,03 TL |
| x-ai/grok-3-mini-beta | 0,018 TL | 0,03 TL |
| x-ai/grok-4 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-4-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-4.20-beta | 0,1201 TL | 0,3604 TL |
| x-ai/grok-4.20-multi-agent-beta | 0,1201 TL | 0,3604 TL |
| xiaomi/mimo-v2-flash | 0,0054 TL | 0,0174 TL |
| xiaomi/mimo-v2-omni | 0,024 TL | 0,1201 TL |
| xiaomi/mimo-v2-pro | 0,0601 TL | 0,1802 TL |
| z-ai/glm-4-32b | 0,006 TL | 0,006 TL |
| z-ai/glm-4.5 | 0,036 TL | 0,1321 TL |
| z-ai/glm-4.5-air | 0,0078 TL | 0,0511 TL |
| z-ai/glm-4.5v | 0,036 TL | 0,1081 TL |
| z-ai/glm-4.6 | 0,0234 TL | 0,1141 TL |
| z-ai/glm-4.6v | 0,018 TL | 0,0541 TL |
| z-ai/glm-4.7 | 0,0234 TL | 0,1051 TL |
| z-ai/glm-4.7-flash | 0,0036 TL | 0,024 TL |
| z-ai/glm-5 | 0,0432 TL | 0,1381 TL |
| z-ai/glm-5-turbo | 0,0721 TL | 0,2402 TL |
curl -X POST https://tektik.ai/api/v1/services/seo-meta \
-H "Authorization: Bearer tkai_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"request": {
"title": "İçerik başlığı",
"content": "İçerik metni (ilk 3000 karakter)",
"locale": "Hedef dil (varsayılan: tr)",
"keywords": "Mevcut anahtar kelimeler (varsa)"
}
}'
Response Example
{
"response": {
"metaTitle": "...",
"metaDescription": "...",
"metaKeywords": "...",
"ogTitle": "...",
"ogDescription": "...",
"twitterCard": "summary_large_image"
},
"costTl": 0.0234
}
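The same seo-meta call as a Python sketch; parameter names and response fields follow the tables and examples above, and the title/content values are placeholders.

import requests

resp = requests.post(
    "https://tektik.ai/api/v1/services/seo-meta",
    headers={"Authorization": "Bearer tkai_YOUR_API_KEY"},
    json={
        "model": "openai/gpt-4o",
        "request": {
            "title": "Content title",                           # placeholder
            "content": "Content text (first 3000 characters)",  # placeholder
            "locale": "tr",
        },
    },
)
resp.raise_for_status()
meta = resp.json()["response"]
print(meta["metaTitle"])
print(meta["metaDescription"])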
/api/v1/services/media-seo
Media SEO Generator
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | AI model to use |
| media_url | media_url | Yes | URL of the media file (e.g., https://example.com/image.jpg) |
| context | string | No | Where the media is used |
| locale | enum: English, French, Arabic, Japanese | No | Target language (e.g., English) |
Available Models
| Model | Input (1K tokens) | Output (1K tokens) |
|---|---|---|
| openai/gpt-5.3-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.3-codex | 0,1051 TL | 0,8408 TL |
| x-ai/grok-4.1-fast | 0,012 TL | 0,03 TL |
| anthropic/claude-sonnet-4.6 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4.6 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-haiku-4.5 | 0,0601 TL | 0,3003 TL |
| amazon/nova-2-lite-v1 | 0,018 TL | 0,1502 TL |
| amazon/nova-lite-v1 | 0,0036 TL | 0,0144 TL |
| amazon/nova-premier-v1 | 0,1502 TL | 0,7507 TL |
| amazon/nova-pro-v1 | 0,048 TL | 0,1922 TL |
| anthropic/claude-3-haiku | 0,015 TL | 0,0751 TL |
| anthropic/claude-3.5-haiku | 0,048 TL | 0,2402 TL |
| anthropic/claude-3.5-sonnet | 0,3604 TL | 1,8018 TL |
| anthropic/claude-3.7-sonnet | 0,1802 TL | 0,9009 TL |
| anthropic/claude-3.7-sonnet:thinking | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.1 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.5 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-sonnet-4 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-sonnet-4.5 | 0,1802 TL | 0,9009 TL |
| arcee-ai/spotlight | 0,0108 TL | 0,0108 TL |
| openrouter/auto | -60.060,00 TL | -60.060,00 TL |
| baidu/ernie-4.5-vl-28b-a3b | 0,0084 TL | 0,0336 TL |
| baidu/ernie-4.5-vl-424b-a47b | 0,0252 TL | 0,0751 TL |
| bytedance-seed/seed-1.6 | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-1.6-flash | 0,0045 TL | 0,018 TL |
| bytedance-seed/seed-2.0-lite | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-2.0-mini | 0,006 TL | 0,024 TL |
| bytedance/ui-tars-1.5-7b | 0,006 TL | 0,012 TL |
| google/gemini-2.0-flash-001 | 0,006 TL | 0,024 TL |
| google/gemini-2.0-flash-lite-001 | 0,0045 TL | 0,018 TL |
| google/gemini-2.5-flash | 0,018 TL | 0,1502 TL |
| google/gemini-2.5-flash-lite | 0,006 TL | 0,024 TL |
| google/gemini-2.5-flash-lite-preview-09-2025 | 0,006 TL | 0,024 TL |
| google/gemini-2.5-pro | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview-05-06 | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview | 0,0751 TL | 0,6006 TL |
| google/gemini-3-flash-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3.1-flash-lite-preview | 0,015 TL | 0,0901 TL |
| google/gemini-3.1-pro-preview | 0,1201 TL | 0,7207 TL |
| google/gemini-3.1-pro-preview-customtools | 0,1201 TL | 0,7207 TL |
| google/gemma-3-12b-it | 0,0024 TL | 0,0078 TL |
| google/gemma-3-27b-it | 0,0048 TL | 0,0096 TL |
| google/gemma-3-4b-it | 0,0024 TL | 0,0048 TL |
| google/gemini-2.5-flash-image | 0,018 TL | 0,1502 TL |
| google/gemini-3.1-flash-image-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3-pro-image-preview | 0,1201 TL | 0,7207 TL |
| meta-llama/llama-3.2-11b-vision-instruct | 0,0029 TL | 0,0029 TL |
| meta-llama/llama-4-maverick | 0,009 TL | 0,036 TL |
| meta-llama/llama-4-scout | 0,0048 TL | 0,018 TL |
| meta-llama/llama-guard-4-12b | 0,0108 TL | 0,0108 TL |
| minimax/minimax-01 | 0,012 TL | 0,0661 TL |
| mistralai/ministral-14b-2512 | 0,012 TL | 0,012 TL |
| mistralai/ministral-3b-2512 | 0,006 TL | 0,006 TL |
| mistralai/ministral-8b-2512 | 0,009 TL | 0,009 TL |
| mistralai/mistral-large-2512 | 0,03 TL | 0,0901 TL |
| mistralai/mistral-medium-3 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-medium-3.1 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-small-3.1-24b-instruct | 0,0018 TL | 0,0066 TL |
| mistralai/mistral-small-3.2-24b-instruct | 0,0045 TL | 0,012 TL |
| mistralai/mistral-small-2603 | 0,009 TL | 0,036 TL |
| mistralai/pixtral-large-2411 | 0,1201 TL | 0,3604 TL |
| moonshotai/kimi-k2.5 | 0,0252 TL | 0,1321 TL |
| nvidia/nemotron-nano-12b-v2-vl | 0,012 TL | 0,036 TL |
| openai/gpt-4-turbo | 0,6006 TL | 1,8018 TL |
| openai/gpt-4.1 | 0,1201 TL | 0,4805 TL |
| openai/gpt-4.1-mini | 0,024 TL | 0,0961 TL |
| openai/gpt-4.1-nano | 0,006 TL | 0,024 TL |
| openai/gpt-4o | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-05-13 | 0,3003 TL | 0,9009 TL |
| openai/gpt-4o-2024-08-06 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-11-20 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o:extended | 0,3604 TL | 1,0811 TL |
| openai/gpt-4o-mini | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-2024-07-18 | 0,009 TL | 0,036 TL |
| openai/gpt-5 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-image | 0,6006 TL | 0,6006 TL |
| openai/gpt-5-image-mini | 0,1502 TL | 0,1201 TL |
| openai/gpt-5-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5-nano | 0,003 TL | 0,024 TL |
| openai/gpt-5-pro | 0,9009 TL | 7,2072 TL |
| openai/gpt-5.1 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-max | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5.2 | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-pro | 1,2613 TL | 10,0901 TL |
| openai/gpt-5.2-codex | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.4 | 0,1502 TL | 0,9009 TL |
| openai/gpt-5.4-mini | 0,045 TL | 0,2703 TL |
| openai/gpt-5.4-nano | 0,012 TL | 0,0751 TL |
| openai/gpt-5.4-pro | 1,8018 TL | 10,8108 TL |
| openai/o1 | 0,9009 TL | 3,6036 TL |
| openai/o1-pro | 9,009 TL | 36,036 TL |
| openai/o3 | 0,1201 TL | 0,4805 TL |
| openai/o3-deep-research | 0,6006 TL | 2,4024 TL |
| openai/o3-pro | 1,2012 TL | 4,8048 TL |
| openai/o4-mini | 0,0661 TL | 0,2643 TL |
| openai/o4-mini-deep-research | 0,1201 TL | 0,4805 TL |
| openai/o4-mini-high | 0,0661 TL | 0,2643 TL |
| perplexity/sonar | 0,0601 TL | 0,0601 TL |
| perplexity/sonar-pro | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-pro-search | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-reasoning-pro | 0,1201 TL | 0,4805 TL |
| qwen/qwen-vl-max | 0,0312 TL | 0,1249 TL |
| qwen/qwen-vl-plus | 0,0082 TL | 0,0246 TL |
| qwen/qwen2.5-vl-32b-instruct | 0,012 TL | 0,036 TL |
| qwen/qwen2.5-vl-72b-instruct | 0,048 TL | 0,048 TL |
| qwen/qwen3-vl-235b-a22b-instruct | 0,012 TL | 0,0529 TL |
| qwen/qwen3-vl-235b-a22b-thinking | 0,0156 TL | 0,1562 TL |
| qwen/qwen3-vl-30b-a3b-instruct | 0,0078 TL | 0,0312 TL |
| qwen/qwen3-vl-30b-a3b-thinking | 0,0078 TL | 0,0937 TL |
| qwen/qwen3-vl-32b-instruct | 0,0062 TL | 0,025 TL |
| qwen/qwen3-vl-8b-instruct | 0,0048 TL | 0,03 TL |
| qwen/qwen3-vl-8b-thinking | 0,007 TL | 0,082 TL |
| qwen/qwen3.5-397b-a17b | 0,0234 TL | 0,1405 TL |
| qwen/qwen3.5-plus-02-15 | 0,0156 TL | 0,0937 TL |
| qwen/qwen3.5-122b-a10b | 0,0156 TL | 0,1249 TL |
| qwen/qwen3.5-27b | 0,0117 TL | 0,0937 TL |
| qwen/qwen3.5-35b-a3b | 0,0098 TL | 0,0781 TL |
| qwen/qwen3.5-9b | 0,003 TL | 0,009 TL |
| qwen/qwen3.5-flash-02-23 | 0,0039 TL | 0,0156 TL |
| reka/reka-edge | 0,006 TL | 0,006 TL |
| x-ai/grok-4 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-4-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-4.20-beta | 0,1201 TL | 0,3604 TL |
| x-ai/grok-4.20-multi-agent-beta | 0,1201 TL | 0,3604 TL |
| xiaomi/mimo-v2-omni | 0,024 TL | 0,1201 TL |
| z-ai/glm-4.5v | 0,036 TL | 0,1081 TL |
| z-ai/glm-4.6v | 0,018 TL | 0,0541 TL |
curl -X POST https://tektik.ai/api/v1/services/media-seo \
-H "Authorization: Bearer tkai_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"request": {
"media_url": "https://example.com/image.jpg",
"context": "Kullanılan yer",
"locale": "English"
}
}'
Response Example
{
"response": {
"title": "...",
"description": "...",
"altText": "...",
"caption": "..."
},
"costTl": 0.0234
}
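The media-seo call as a Python sketch; parameter names and response fields follow the examples above, and the media_url and context values are placeholders.

import requests

resp = requests.post(
    "https://tektik.ai/api/v1/services/media-seo",
    headers={"Authorization": "Bearer tkai_YOUR_API_KEY"},
    json={
        "model": "openai/gpt-4o",
        "request": {
            "media_url": "https://example.com/image.jpg",  # placeholder
            "context": "Blog post hero image",             # placeholder
            "locale": "English",
        },
    },
)
resp.raise_for_status()
media_meta = resp.json()["response"]
print(media_meta["altText"])
print(media_meta["caption"])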
/api/v1/services/summarize
Generate Summary
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | AI model to use |
| text | string | Yes | Text to summarize |
| length | number | No | Summary length |
| language | string | No | Summary language |
Available Models
| Model | Input (1K tokens) | Output (1K tokens) |
|---|---|---|
| openai/gpt-5.3-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.3-codex | 0,1051 TL | 0,8408 TL |
| x-ai/grok-4.1-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-code-fast-1 | 0,012 TL | 0,0901 TL |
| anthropic/claude-sonnet-4.6 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4.6 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-haiku-4.5 | 0,0601 TL | 0,3003 TL |
| ai21/jamba-large-1.7 | 0,1201 TL | 0,4805 TL |
| aion-labs/aion-1.0 | 0,2402 TL | 0,4805 TL |
| aion-labs/aion-1.0-mini | 0,042 TL | 0,0841 TL |
| aion-labs/aion-2.0 | 0,048 TL | 0,0961 TL |
| aion-labs/aion-rp-llama-3.1-8b | 0,048 TL | 0,0961 TL |
| alfredpros/codellama-7b-instruct-solidity | 0,048 TL | 0,0721 TL |
| allenai/olmo-2-0325-32b-instruct | 0,003 TL | 0,012 TL |
| allenai/olmo-3-32b-think | 0,009 TL | 0,03 TL |
| allenai/olmo-3.1-32b-instruct | 0,012 TL | 0,036 TL |
| allenai/olmo-3.1-32b-think | 0,009 TL | 0,03 TL |
| amazon/nova-2-lite-v1 | 0,018 TL | 0,1502 TL |
| amazon/nova-lite-v1 | 0,0036 TL | 0,0144 TL |
| amazon/nova-micro-v1 | 0,0021 TL | 0,0084 TL |
| amazon/nova-premier-v1 | 0,1502 TL | 0,7507 TL |
| amazon/nova-pro-v1 | 0,048 TL | 0,1922 TL |
| anthropic/claude-3-haiku | 0,015 TL | 0,0751 TL |
| anthropic/claude-3.5-haiku | 0,048 TL | 0,2402 TL |
| anthropic/claude-3.5-sonnet | 0,3604 TL | 1,8018 TL |
| anthropic/claude-3.7-sonnet | 0,1802 TL | 0,9009 TL |
| anthropic/claude-3.7-sonnet:thinking | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.1 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.5 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-sonnet-4 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-sonnet-4.5 | 0,1802 TL | 0,9009 TL |
| arcee-ai/coder-large | 0,03 TL | 0,048 TL |
| arcee-ai/maestro-reasoning | 0,0541 TL | 0,1982 TL |
| arcee-ai/spotlight | 0,0108 TL | 0,0108 TL |
| arcee-ai/trinity-mini | 0,0027 TL | 0,009 TL |
| arcee-ai/virtuoso-large | 0,045 TL | 0,0721 TL |
| openrouter/auto | -60.060,00 TL | -60.060,00 TL |
| baidu/ernie-4.5-21b-a3b | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-21b-a3b-thinking | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-300b-a47b | 0,0168 TL | 0,0661 TL |
| baidu/ernie-4.5-vl-28b-a3b | 0,0084 TL | 0,0336 TL |
| baidu/ernie-4.5-vl-424b-a47b | 0,0252 TL | 0,0751 TL |
| openrouter/bodybuilder | -60.060,00 TL | -60.060,00 TL |
| bytedance-seed/seed-1.6 | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-1.6-flash | 0,0045 TL | 0,018 TL |
| bytedance-seed/seed-2.0-lite | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-2.0-mini | 0,006 TL | 0,024 TL |
| bytedance/ui-tars-1.5-7b | 0,006 TL | 0,012 TL |
| cohere/command-a | 0,1502 TL | 0,6006 TL |
| cohere/command-r-08-2024 | 0,009 TL | 0,036 TL |
| cohere/command-r-plus-08-2024 | 0,1502 TL | 0,6006 TL |
| cohere/command-r7b-12-2024 | 0,0023 TL | 0,009 TL |
| deepcogito/cogito-v2.1-671b | 0,0751 TL | 0,0751 TL |
| deepseek/deepseek-chat | 0,0192 TL | 0,0535 TL |
| deepseek/deepseek-chat-v3-0324 | 0,012 TL | 0,0462 TL |
| deepseek/deepseek-chat-v3.1 | 0,009 TL | 0,045 TL |
| deepseek/deepseek-v3.1-terminus | 0,0126 TL | 0,0474 TL |
| deepseek/deepseek-v3.2 | 0,0156 TL | 0,0228 TL |
| deepseek/deepseek-v3.2-exp | 0,0162 TL | 0,0246 TL |
| deepseek/deepseek-v3.2-speciale | 0,024 TL | 0,0721 TL |
| deepseek/deepseek-r1 | 0,042 TL | 0,1502 TL |
| deepseek/deepseek-r1-0528 | 0,027 TL | 0,1291 TL |
| deepseek/deepseek-r1-distill-llama-70b | 0,042 TL | 0,048 TL |
| deepseek/deepseek-r1-distill-qwen-32b | 0,0174 TL | 0,0174 TL |
| eleutherai/llemma_7b | 0,048 TL | 0,0721 TL |
| essentialai/rnj-1-instruct | 0,009 TL | 0,009 TL |
| alpindale/goliath-120b | 0,2252 TL | 0,4505 TL |
| google/gemini-2.0-flash-001 | 0,006 TL | 0,024 TL |
| google/gemini-2.0-flash-lite-001 | 0,0045 TL | 0,018 TL |
| google/gemini-2.5-flash | 0,018 TL | 0,1502 TL |
| google/gemini-2.5-flash-lite | 0,006 TL | 0,024 TL |
| google/gemini-2.5-flash-lite-preview-09-2025 | 0,006 TL | 0,024 TL |
| google/gemini-2.5-pro | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview-05-06 | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview | 0,0751 TL | 0,6006 TL |
| google/gemini-3-flash-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3.1-flash-lite-preview | 0,015 TL | 0,0901 TL |
| google/gemini-3.1-pro-preview | 0,1201 TL | 0,7207 TL |
| google/gemini-3.1-pro-preview-customtools | 0,1201 TL | 0,7207 TL |
| google/gemma-2-27b-it | 0,039 TL | 0,039 TL |
| google/gemma-2-9b-it | 0,0018 TL | 0,0054 TL |
| google/gemma-3-12b-it | 0,0024 TL | 0,0078 TL |
| google/gemma-3-27b-it | 0,0048 TL | 0,0096 TL |
| google/gemma-3-4b-it | 0,0024 TL | 0,0048 TL |
| google/gemma-3n-e4b-it | 0,0012 TL | 0,0024 TL |
| google/gemini-2.5-flash-image | 0,018 TL | 0,1502 TL |
| google/gemini-3.1-flash-image-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3-pro-image-preview | 0,1201 TL | 0,7207 TL |
| ibm-granite/granite-4.0-h-micro | 0,001 TL | 0,0066 TL |
| inception/mercury | 0,015 TL | 0,045 TL |
| inception/mercury-2 | 0,015 TL | 0,045 TL |
| inception/mercury-coder | 0,015 TL | 0,045 TL |
| inflection/inflection-3-pi | 0,1502 TL | 0,6006 TL |
| inflection/inflection-3-productivity | 0,1502 TL | 0,6006 TL |
| kwaipilot/kat-coder-pro | 0,0124 TL | 0,0497 TL |
| kwaipilot/kat-coder-pro-v2 | 0,018 TL | 0,0721 TL |
| liquid/lfm-2.2-6b | 0,0006 TL | 0,0012 TL |
| liquid/lfm-2-24b-a2b | 0,0018 TL | 0,0072 TL |
| liquid/lfm2-8b-a1b | 0,0006 TL | 0,0012 TL |
| meta-llama/llama-guard-3-8b | 0,0012 TL | 0,0036 TL |
| anthracite-org/magnum-v4-72b | 0,1802 TL | 0,3003 TL |
| mancer/weaver | 0,045 TL | 0,0601 TL |
| meituan/longcat-flash-chat | 0,012 TL | 0,048 TL |
| meta-llama/llama-3-70b-instruct | 0,0306 TL | 0,0444 TL |
| meta-llama/llama-3-8b-instruct | 0,0018 TL | 0,0024 TL |
| meta-llama/llama-3.1-70b-instruct | 0,024 TL | 0,024 TL |
| meta-llama/llama-3.1-8b-instruct | 0,0012 TL | 0,003 TL |
| meta-llama/llama-3.2-11b-vision-instruct | 0,0029 TL | 0,0029 TL |
| meta-llama/llama-3.2-1b-instruct | 0,0016 TL | 0,012 TL |
| meta-llama/llama-3.2-3b-instruct | 0,0031 TL | 0,0204 TL |
| meta-llama/llama-3.3-70b-instruct | 0,006 TL | 0,0192 TL |
| meta-llama/llama-4-maverick | 0,009 TL | 0,036 TL |
| meta-llama/llama-4-scout | 0,0048 TL | 0,018 TL |
| meta-llama/llama-guard-4-12b | 0,0108 TL | 0,0108 TL |
| microsoft/phi-4 | 0,0039 TL | 0,0084 TL |
| minimax/minimax-m1 | 0,024 TL | 0,1321 TL |
| minimax/minimax-m2 | 0,0153 TL | 0,0601 TL |
| minimax/minimax-m2-her | 0,018 TL | 0,0721 TL |
| minimax/minimax-m2.1 | 0,0162 TL | 0,0571 TL |
| minimax/minimax-m2.5 | 0,0114 TL | 0,0691 TL |
| minimax/minimax-m2.7 | 0,018 TL | 0,0721 TL |
| minimax/minimax-01 | 0,012 TL | 0,0661 TL |
| mistralai/mistral-large | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2407 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/codestral-2508 | 0,018 TL | 0,0541 TL |
| mistralai/devstral-2512 | 0,024 TL | 0,1201 TL |
| mistralai/devstral-medium | 0,024 TL | 0,1201 TL |
| mistralai/devstral-small | 0,006 TL | 0,018 TL |
| mistralai/ministral-14b-2512 | 0,012 TL | 0,012 TL |
| mistralai/ministral-3b-2512 | 0,006 TL | 0,006 TL |
| mistralai/ministral-8b-2512 | 0,009 TL | 0,009 TL |
| mistralai/mistral-7b-instruct-v0.1 | 0,0066 TL | 0,0114 TL |
| mistralai/mistral-large-2512 | 0,03 TL | 0,0901 TL |
| mistralai/mistral-medium-3 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-medium-3.1 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-nemo | 0,0012 TL | 0,0024 TL |
| mistralai/mistral-small-24b-instruct-2501 | 0,003 TL | 0,0048 TL |
| mistralai/mistral-small-3.1-24b-instruct | 0,0018 TL | 0,0066 TL |
| mistralai/mistral-small-3.2-24b-instruct | 0,0045 TL | 0,012 TL |
| mistralai/mistral-small-2603 | 0,009 TL | 0,036 TL |
| mistralai/mistral-small-creative | 0,006 TL | 0,018 TL |
| mistralai/mixtral-8x22b-instruct | 0,1201 TL | 0,3604 TL |
| mistralai/mixtral-8x7b-instruct | 0,0324 TL | 0,0324 TL |
| mistralai/pixtral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-saba | 0,012 TL | 0,036 TL |
| mistralai/voxtral-small-24b-2507 | 0,006 TL | 0,018 TL |
| moonshotai/kimi-k2 | 0,0342 TL | 0,1381 TL |
| moonshotai/kimi-k2-0905 | 0,024 TL | 0,1201 TL |
| moonshotai/kimi-k2-thinking | 0,0282 TL | 0,1201 TL |
| moonshotai/kimi-k2.5 | 0,0252 TL | 0,1321 TL |
| morph/morph-v3-fast | 0,048 TL | 0,0721 TL |
| morph/morph-v3-large | 0,0541 TL | 0,1141 TL |
| gryphe/mythomax-l2-13b | 0,0036 TL | 0,0036 TL |
| nex-agi/deepseek-v3.1-nex-n1 | 0,0081 TL | 0,03 TL |
| nousresearch/hermes-3-llama-3.1-405b | 0,0601 TL | 0,0601 TL |
| nousresearch/hermes-3-llama-3.1-70b | 0,018 TL | 0,018 TL |
| nousresearch/hermes-4-405b | 0,0601 TL | 0,1802 TL |
| nousresearch/hermes-4-70b | 0,0078 TL | 0,024 TL |
| nousresearch/hermes-2-pro-llama-3-8b | 0,0084 TL | 0,0084 TL |
| nvidia/llama-3.1-nemotron-70b-instruct | 0,0721 TL | 0,0721 TL |
| nvidia/llama-3.1-nemotron-ultra-253b-v1 | 0,036 TL | 0,1081 TL |
| nvidia/llama-3.3-nemotron-super-49b-v1.5 | 0,006 TL | 0,024 TL |
| nvidia/nemotron-3-nano-30b-a3b | 0,003 TL | 0,012 TL |
| nvidia/nemotron-3-super-120b-a12b | 0,006 TL | 0,03 TL |
| nvidia/nemotron-nano-12b-v2-vl | 0,012 TL | 0,036 TL |
| nvidia/nemotron-nano-9b-v2 | 0,0024 TL | 0,0096 TL |
| openai/gpt-audio | 0,1502 TL | 0,6006 TL |
| openai/gpt-audio-mini | 0,036 TL | 0,1441 TL |
| openai/gpt-3.5-turbo | 0,03 TL | 0,0901 TL |
| openai/gpt-3.5-turbo-0613 | 0,0601 TL | 0,1201 TL |
| openai/gpt-3.5-turbo-16k | 0,1802 TL | 0,2402 TL |
| openai/gpt-3.5-turbo-instruct | 0,0901 TL | 0,1201 TL |
| openai/gpt-4 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-0314 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-turbo | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-1106-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-turbo-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4.1 | 0,1201 TL | 0,4805 TL |
| openai/gpt-4.1-mini | 0,024 TL | 0,0961 TL |
| openai/gpt-4.1-nano | 0,006 TL | 0,024 TL |
| openai/gpt-4o | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-05-13 | 0,3003 TL | 0,9009 TL |
| openai/gpt-4o-2024-08-06 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-11-20 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o:extended | 0,3604 TL | 1,0811 TL |
| openai/gpt-4o-audio-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-search-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-mini | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-2024-07-18 | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-search-preview | 0,009 TL | 0,036 TL |
| openai/gpt-5 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-image | 0,6006 TL | 0,6006 TL |
| openai/gpt-5-image-mini | 0,1502 TL | 0,1201 TL |
| openai/gpt-5-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5-nano | 0,003 TL | 0,024 TL |
| openai/gpt-5-pro | 0,9009 TL | 7,2072 TL |
| openai/gpt-5.1 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-max | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5.2 | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-pro | 1,2613 TL | 10,0901 TL |
| openai/gpt-5.2-codex | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.4 | 0,1502 TL | 0,9009 TL |
| openai/gpt-5.4-mini | 0,045 TL | 0,2703 TL |
| openai/gpt-5.4-nano | 0,012 TL | 0,0751 TL |
| openai/gpt-5.4-pro | 1,8018 TL | 10,8108 TL |
| openai/gpt-oss-120b | 0,0023 TL | 0,0114 TL |
| openai/gpt-oss-20b | 0,0018 TL | 0,0066 TL |
| openai/gpt-oss-safeguard-20b | 0,0045 TL | 0,018 TL |
| openai/o1 | 0,9009 TL | 3,6036 TL |
| openai/o1-pro | 9,009 TL | 36,036 TL |
| openai/o3 | 0,1201 TL | 0,4805 TL |
| openai/o3-deep-research | 0,6006 TL | 2,4024 TL |
| openai/o3-mini | 0,0661 TL | 0,2643 TL |
| openai/o3-mini-high | 0,0661 TL | 0,2643 TL |
| openai/o3-pro | 1,2012 TL | 4,8048 TL |
| openai/o4-mini | 0,0661 TL | 0,2643 TL |
| openai/o4-mini-deep-research | 0,1201 TL | 0,4805 TL |
| openai/o4-mini-high | 0,0661 TL | 0,2643 TL |
| perplexity/sonar | 0,0601 TL | 0,0601 TL |
| perplexity/sonar-deep-research | 0,1201 TL | 0,4805 TL |
| perplexity/sonar-pro | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-pro-search | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-reasoning-pro | 0,1201 TL | 0,4805 TL |
| prime-intellect/intellect-3 | 0,012 TL | 0,0661 TL |
| qwen/qwen-plus-2025-07-28 | 0,0156 TL | 0,0468 TL |
| qwen/qwen-plus-2025-07-28:thinking | 0,0156 TL | 0,0468 TL |
| qwen/qwen-vl-max | 0,0312 TL | 0,1249 TL |
| qwen/qwen-vl-plus | 0,0082 TL | 0,0246 TL |
| qwen/qwen-max | 0,0625 TL | 0,2498 TL |
| qwen/qwen-plus | 0,0156 TL | 0,0468 TL |
| qwen/qwen-turbo | 0,002 TL | 0,0078 TL |
| qwen/qwen-2.5-7b-instruct | 0,0024 TL | 0,006 TL |
| qwen/qwen2.5-coder-7b-instruct | 0,0018 TL | 0,0054 TL |
| qwen/qwen2.5-vl-32b-instruct | 0,012 TL | 0,036 TL |
| qwen/qwen2.5-vl-72b-instruct | 0,048 TL | 0,048 TL |
| qwen/qwen3-14b | 0,0036 TL | 0,0144 TL |
| qwen/qwen3-235b-a22b | 0,0273 TL | 0,1093 TL |
| qwen/qwen3-235b-a22b-2507 | 0,0043 TL | 0,006 TL |
| qwen/qwen3-235b-a22b-thinking-2507 | 0,009 TL | 0,0898 TL |
| qwen/qwen3-30b-a3b | 0,0048 TL | 0,0168 TL |
| qwen/qwen3-30b-a3b-instruct-2507 | 0,0054 TL | 0,018 TL |
| qwen/qwen3-30b-a3b-thinking-2507 | 0,0048 TL | 0,024 TL |
| qwen/qwen3-32b | 0,0048 TL | 0,0144 TL |
| qwen/qwen3-8b | 0,003 TL | 0,024 TL |
| qwen/qwen3-coder-30b-a3b-instruct | 0,0042 TL | 0,0162 TL |
| qwen/qwen3-coder | 0,0132 TL | 0,0601 TL |
| qwen/qwen3-coder-flash | 0,0117 TL | 0,0586 TL |
| qwen/qwen3-coder-next | 0,0072 TL | 0,045 TL |
| qwen/qwen3-coder-plus | 0,039 TL | 0,1952 TL |
| qwen/qwen3-max | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-max-thinking | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-next-80b-a3b-instruct | 0,0054 TL | 0,0661 TL |
| qwen/qwen3-next-80b-a3b-thinking | 0,0059 TL | 0,0468 TL |
| qwen/qwen3-vl-235b-a22b-instruct | 0,012 TL | 0,0529 TL |
| qwen/qwen3-vl-235b-a22b-thinking | 0,0156 TL | 0,1562 TL |
| qwen/qwen3-vl-30b-a3b-instruct | 0,0078 TL | 0,0312 TL |
| qwen/qwen3-vl-30b-a3b-thinking | 0,0078 TL | 0,0937 TL |
| qwen/qwen3-vl-32b-instruct | 0,0062 TL | 0,025 TL |
| qwen/qwen3-vl-8b-instruct | 0,0048 TL | 0,03 TL |
| qwen/qwen3-vl-8b-thinking | 0,007 TL | 0,082 TL |
| qwen/qwen3.5-397b-a17b | 0,0234 TL | 0,1405 TL |
| qwen/qwen3.5-plus-02-15 | 0,0156 TL | 0,0937 TL |
| qwen/qwen3.5-122b-a10b | 0,0156 TL | 0,1249 TL |
| qwen/qwen3.5-27b | 0,0117 TL | 0,0937 TL |
| qwen/qwen3.5-35b-a3b | 0,0098 TL | 0,0781 TL |
| qwen/qwen3.5-9b | 0,003 TL | 0,009 TL |
| qwen/qwen3.5-flash-02-23 | 0,0039 TL | 0,0156 TL |
| qwen/qwq-32b | 0,009 TL | 0,0348 TL |
| qwen/qwen-2.5-72b-instruct | 0,0072 TL | 0,0234 TL |
| qwen/qwen-2.5-coder-32b-instruct | 0,0396 TL | 0,0601 TL |
| reka/reka-edge | 0,006 TL | 0,006 TL |
| relace/relace-apply-3 | 0,0511 TL | 0,0751 TL |
| relace/relace-search | 0,0601 TL | 0,1802 TL |
| undi95/remm-slerp-l2-13b | 0,027 TL | 0,039 TL |
| sao10k/l3-lunaris-8b | 0,0024 TL | 0,003 TL |
| sao10k/l3-euryale-70b | 0,0889 TL | 0,0889 TL |
| sao10k/l3.1-70b-hanami-x1 | 0,1802 TL | 0,1802 TL |
| sao10k/l3.1-euryale-70b | 0,0511 TL | 0,0511 TL |
| sao10k/l3.3-euryale-70b | 0,039 TL | 0,045 TL |
| stepfun/step-3.5-flash | 0,006 TL | 0,018 TL |
| switchpoint/router | 0,0511 TL | 0,2042 TL |
| tencent/hunyuan-a13b-instruct | 0,0084 TL | 0,0342 TL |
| thedrummer/cydonia-24b-v4.1 | 0,018 TL | 0,03 TL |
| thedrummer/rocinante-12b | 0,0102 TL | 0,0258 TL |
| thedrummer/skyfall-36b-v2 | 0,033 TL | 0,048 TL |
| thedrummer/unslopnemo-12b | 0,024 TL | 0,024 TL |
| tngtech/deepseek-r1t2-chimera | 0,018 TL | 0,0661 TL |
| alibaba/tongyi-deepresearch-30b-a3b | 0,0054 TL | 0,027 TL |
| upstage/solar-pro-3 | 0,009 TL | 0,036 TL |
| microsoft/wizardlm-2-8x22b | 0,0372 TL | 0,0372 TL |
| writer/palmyra-x5 | 0,036 TL | 0,3604 TL |
| x-ai/grok-3 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-beta | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-mini | 0,018 TL | 0,03 TL |
| x-ai/grok-3-mini-beta | 0,018 TL | 0,03 TL |
| x-ai/grok-4 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-4-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-4.20-beta | 0,1201 TL | 0,3604 TL |
| x-ai/grok-4.20-multi-agent-beta | 0,1201 TL | 0,3604 TL |
| xiaomi/mimo-v2-flash | 0,0054 TL | 0,0174 TL |
| xiaomi/mimo-v2-omni | 0,024 TL | 0,1201 TL |
| xiaomi/mimo-v2-pro | 0,0601 TL | 0,1802 TL |
| z-ai/glm-4-32b | 0,006 TL | 0,006 TL |
| z-ai/glm-4.5 | 0,036 TL | 0,1321 TL |
| z-ai/glm-4.5-air | 0,0078 TL | 0,0511 TL |
| z-ai/glm-4.5v | 0,036 TL | 0,1081 TL |
| z-ai/glm-4.6 | 0,0234 TL | 0,1141 TL |
| z-ai/glm-4.6v | 0,018 TL | 0,0541 TL |
| z-ai/glm-4.7 | 0,0234 TL | 0,1051 TL |
| z-ai/glm-4.7-flash | 0,0036 TL | 0,024 TL |
| z-ai/glm-5 | 0,0432 TL | 0,1381 TL |
| z-ai/glm-5-turbo | 0,0721 TL | 0,2402 TL |
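Prices in the table above are per 1,000 tokens, so a single request's cost is the input and output token counts multiplied by their respective per-1K rates and summed. A minimal sketch of that arithmetic in Python, using the openai/gpt-4o rates from the table and illustrative token counts; actual billing is based on the token counts the API reports.

# Minimal sketch: estimate a request's cost from the per-1K-token prices above.
# Token counts are illustrative, not measured.
def estimate_cost_tl(input_tokens: int, output_tokens: int,
                     input_price_per_1k: float, output_price_per_1k: float) -> float:
    """Return the estimated cost in TL for one request."""
    return (input_tokens / 1000) * input_price_per_1k \
        + (output_tokens / 1000) * output_price_per_1k

# openai/gpt-4o rates from the table above: 0.1502 TL in, 0.6006 TL out per 1K tokens.
cost = estimate_cost_tl(input_tokens=800, output_tokens=200,
                        input_price_per_1k=0.1502, output_price_per_1k=0.6006)
print(f"Estimated cost: {cost:.4f} TL")  # roughly 0.2403 TL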
curl -X POST https://tektik.ai/api/v1/services/summarize \
-H "Authorization: Bearer tkai_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"request": {
"text": "Özetlenecek alan",
"length": "Ozet boyutu",
"language": "Ozet dili"
}
}'
Response Example
{
"response": "...",
"costTl": 0.0234
}
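The same call can be made from Python. A minimal sketch using the requests library, assuming the endpoint and fields shown in the summarize example above; the API key and the length and language values are placeholders.

# Minimal sketch of the summarize call above using the requests library.
# API key, length, and language values are placeholders.
import requests

API_KEY = "tkai_YOUR_API_KEY"  # placeholder

payload = {
    "model": "openai/gpt-4o",
    "request": {
        "text": "Text to summarize",
        "length": "short",      # illustrative value
        "language": "English",  # illustrative value
    },
}

resp = requests.post(
    "https://tektik.ai/api/v1/services/summarize",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=60,
)
resp.raise_for_status()
data = resp.json()
print(data["response"])                 # the generated summary
print(f"Charged: {data['costTl']} TL")  # cost reported by the API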
/api/v1/services/writing_helper
Writing Helper
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Yes | AI model to use |
| text | string | Yes | Text to process |
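The request envelope has the same shape as the other services: the model field sits at the top level and the service-specific fields from the table above go under request. A minimal sketch of the body; the model and text values are placeholders, and any model from the list below can be substituted.

# Minimal writing_helper request body, per the parameter table above:
# "model" at the top level, service-specific fields under "request".
import json

body = {
    "model": "openai/gpt-4o",       # any model from the table below
    "request": {
        "text": "Text to process",  # placeholder input
    },
}
print(json.dumps(body, ensure_ascii=False, indent=2))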
Available Models
| Model | Input (1K tokens) | Output (1K tokens) |
|---|---|---|
| openai/gpt-5.3-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.3-codex | 0,1051 TL | 0,8408 TL |
| x-ai/grok-4.1-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-code-fast-1 | 0,012 TL | 0,0901 TL |
| anthropic/claude-sonnet-4.6 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4.6 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-haiku-4.5 | 0,0601 TL | 0,3003 TL |
| ai21/jamba-large-1.7 | 0,1201 TL | 0,4805 TL |
| aion-labs/aion-1.0 | 0,2402 TL | 0,4805 TL |
| aion-labs/aion-1.0-mini | 0,042 TL | 0,0841 TL |
| aion-labs/aion-2.0 | 0,048 TL | 0,0961 TL |
| aion-labs/aion-rp-llama-3.1-8b | 0,048 TL | 0,0961 TL |
| alfredpros/codellama-7b-instruct-solidity | 0,048 TL | 0,0721 TL |
| allenai/olmo-2-0325-32b-instruct | 0,003 TL | 0,012 TL |
| allenai/olmo-3-32b-think | 0,009 TL | 0,03 TL |
| allenai/olmo-3.1-32b-instruct | 0,012 TL | 0,036 TL |
| allenai/olmo-3.1-32b-think | 0,009 TL | 0,03 TL |
| amazon/nova-2-lite-v1 | 0,018 TL | 0,1502 TL |
| amazon/nova-lite-v1 | 0,0036 TL | 0,0144 TL |
| amazon/nova-micro-v1 | 0,0021 TL | 0,0084 TL |
| amazon/nova-premier-v1 | 0,1502 TL | 0,7507 TL |
| amazon/nova-pro-v1 | 0,048 TL | 0,1922 TL |
| anthropic/claude-3-haiku | 0,015 TL | 0,0751 TL |
| anthropic/claude-3.5-haiku | 0,048 TL | 0,2402 TL |
| anthropic/claude-3.5-sonnet | 0,3604 TL | 1,8018 TL |
| anthropic/claude-3.7-sonnet | 0,1802 TL | 0,9009 TL |
| anthropic/claude-3.7-sonnet:thinking | 0,1802 TL | 0,9009 TL |
| anthropic/claude-opus-4 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.1 | 0,9009 TL | 4,5045 TL |
| anthropic/claude-opus-4.5 | 0,3003 TL | 1,5015 TL |
| anthropic/claude-sonnet-4 | 0,1802 TL | 0,9009 TL |
| anthropic/claude-sonnet-4.5 | 0,1802 TL | 0,9009 TL |
| arcee-ai/coder-large | 0,03 TL | 0,048 TL |
| arcee-ai/maestro-reasoning | 0,0541 TL | 0,1982 TL |
| arcee-ai/spotlight | 0,0108 TL | 0,0108 TL |
| arcee-ai/trinity-mini | 0,0027 TL | 0,009 TL |
| arcee-ai/virtuoso-large | 0,045 TL | 0,0721 TL |
| openrouter/auto | -60.060,00 TL | -60.060,00 TL |
| baidu/ernie-4.5-21b-a3b | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-21b-a3b-thinking | 0,0042 TL | 0,0168 TL |
| baidu/ernie-4.5-300b-a47b | 0,0168 TL | 0,0661 TL |
| baidu/ernie-4.5-vl-28b-a3b | 0,0084 TL | 0,0336 TL |
| baidu/ernie-4.5-vl-424b-a47b | 0,0252 TL | 0,0751 TL |
| openrouter/bodybuilder | -60.060,00 TL | -60.060,00 TL |
| bytedance-seed/seed-1.6 | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-1.6-flash | 0,0045 TL | 0,018 TL |
| bytedance-seed/seed-2.0-lite | 0,015 TL | 0,1201 TL |
| bytedance-seed/seed-2.0-mini | 0,006 TL | 0,024 TL |
| bytedance/ui-tars-1.5-7b | 0,006 TL | 0,012 TL |
| cohere/command-a | 0,1502 TL | 0,6006 TL |
| cohere/command-r-08-2024 | 0,009 TL | 0,036 TL |
| cohere/command-r-plus-08-2024 | 0,1502 TL | 0,6006 TL |
| cohere/command-r7b-12-2024 | 0,0023 TL | 0,009 TL |
| deepcogito/cogito-v2.1-671b | 0,0751 TL | 0,0751 TL |
| deepseek/deepseek-chat | 0,0192 TL | 0,0535 TL |
| deepseek/deepseek-chat-v3-0324 | 0,012 TL | 0,0462 TL |
| deepseek/deepseek-chat-v3.1 | 0,009 TL | 0,045 TL |
| deepseek/deepseek-v3.1-terminus | 0,0126 TL | 0,0474 TL |
| deepseek/deepseek-v3.2 | 0,0156 TL | 0,0228 TL |
| deepseek/deepseek-v3.2-exp | 0,0162 TL | 0,0246 TL |
| deepseek/deepseek-v3.2-speciale | 0,024 TL | 0,0721 TL |
| deepseek/deepseek-r1 | 0,042 TL | 0,1502 TL |
| deepseek/deepseek-r1-0528 | 0,027 TL | 0,1291 TL |
| deepseek/deepseek-r1-distill-llama-70b | 0,042 TL | 0,048 TL |
| deepseek/deepseek-r1-distill-qwen-32b | 0,0174 TL | 0,0174 TL |
| eleutherai/llemma_7b | 0,048 TL | 0,0721 TL |
| essentialai/rnj-1-instruct | 0,009 TL | 0,009 TL |
| alpindale/goliath-120b | 0,2252 TL | 0,4505 TL |
| google/gemini-2.0-flash-001 | 0,006 TL | 0,024 TL |
| google/gemini-2.0-flash-lite-001 | 0,0045 TL | 0,018 TL |
| google/gemini-2.5-flash | 0,018 TL | 0,1502 TL |
| google/gemini-2.5-flash-lite | 0,006 TL | 0,024 TL |
| google/gemini-2.5-flash-lite-preview-09-2025 | 0,006 TL | 0,024 TL |
| google/gemini-2.5-pro | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview-05-06 | 0,0751 TL | 0,6006 TL |
| google/gemini-2.5-pro-preview | 0,0751 TL | 0,6006 TL |
| google/gemini-3-flash-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3.1-flash-lite-preview | 0,015 TL | 0,0901 TL |
| google/gemini-3.1-pro-preview | 0,1201 TL | 0,7207 TL |
| google/gemini-3.1-pro-preview-customtools | 0,1201 TL | 0,7207 TL |
| google/gemma-2-27b-it | 0,039 TL | 0,039 TL |
| google/gemma-2-9b-it | 0,0018 TL | 0,0054 TL |
| google/gemma-3-12b-it | 0,0024 TL | 0,0078 TL |
| google/gemma-3-27b-it | 0,0048 TL | 0,0096 TL |
| google/gemma-3-4b-it | 0,0024 TL | 0,0048 TL |
| google/gemma-3n-e4b-it | 0,0012 TL | 0,0024 TL |
| google/gemini-2.5-flash-image | 0,018 TL | 0,1502 TL |
| google/gemini-3.1-flash-image-preview | 0,03 TL | 0,1802 TL |
| google/gemini-3-pro-image-preview | 0,1201 TL | 0,7207 TL |
| ibm-granite/granite-4.0-h-micro | 0,001 TL | 0,0066 TL |
| inception/mercury | 0,015 TL | 0,045 TL |
| inception/mercury-2 | 0,015 TL | 0,045 TL |
| inception/mercury-coder | 0,015 TL | 0,045 TL |
| inflection/inflection-3-pi | 0,1502 TL | 0,6006 TL |
| inflection/inflection-3-productivity | 0,1502 TL | 0,6006 TL |
| kwaipilot/kat-coder-pro | 0,0124 TL | 0,0497 TL |
| kwaipilot/kat-coder-pro-v2 | 0,018 TL | 0,0721 TL |
| liquid/lfm-2.2-6b | 0,0006 TL | 0,0012 TL |
| liquid/lfm-2-24b-a2b | 0,0018 TL | 0,0072 TL |
| liquid/lfm2-8b-a1b | 0,0006 TL | 0,0012 TL |
| meta-llama/llama-guard-3-8b | 0,0012 TL | 0,0036 TL |
| anthracite-org/magnum-v4-72b | 0,1802 TL | 0,3003 TL |
| mancer/weaver | 0,045 TL | 0,0601 TL |
| meituan/longcat-flash-chat | 0,012 TL | 0,048 TL |
| meta-llama/llama-3-70b-instruct | 0,0306 TL | 0,0444 TL |
| meta-llama/llama-3-8b-instruct | 0,0018 TL | 0,0024 TL |
| meta-llama/llama-3.1-70b-instruct | 0,024 TL | 0,024 TL |
| meta-llama/llama-3.1-8b-instruct | 0,0012 TL | 0,003 TL |
| meta-llama/llama-3.2-11b-vision-instruct | 0,0029 TL | 0,0029 TL |
| meta-llama/llama-3.2-1b-instruct | 0,0016 TL | 0,012 TL |
| meta-llama/llama-3.2-3b-instruct | 0,0031 TL | 0,0204 TL |
| meta-llama/llama-3.3-70b-instruct | 0,006 TL | 0,0192 TL |
| meta-llama/llama-4-maverick | 0,009 TL | 0,036 TL |
| meta-llama/llama-4-scout | 0,0048 TL | 0,018 TL |
| meta-llama/llama-guard-4-12b | 0,0108 TL | 0,0108 TL |
| microsoft/phi-4 | 0,0039 TL | 0,0084 TL |
| minimax/minimax-m1 | 0,024 TL | 0,1321 TL |
| minimax/minimax-m2 | 0,0153 TL | 0,0601 TL |
| minimax/minimax-m2-her | 0,018 TL | 0,0721 TL |
| minimax/minimax-m2.1 | 0,0162 TL | 0,0571 TL |
| minimax/minimax-m2.5 | 0,0114 TL | 0,0691 TL |
| minimax/minimax-m2.7 | 0,018 TL | 0,0721 TL |
| minimax/minimax-01 | 0,012 TL | 0,0661 TL |
| mistralai/mistral-large | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2407 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/codestral-2508 | 0,018 TL | 0,0541 TL |
| mistralai/devstral-2512 | 0,024 TL | 0,1201 TL |
| mistralai/devstral-medium | 0,024 TL | 0,1201 TL |
| mistralai/devstral-small | 0,006 TL | 0,018 TL |
| mistralai/ministral-14b-2512 | 0,012 TL | 0,012 TL |
| mistralai/ministral-3b-2512 | 0,006 TL | 0,006 TL |
| mistralai/ministral-8b-2512 | 0,009 TL | 0,009 TL |
| mistralai/mistral-7b-instruct-v0.1 | 0,0066 TL | 0,0114 TL |
| mistralai/mistral-large-2512 | 0,03 TL | 0,0901 TL |
| mistralai/mistral-medium-3 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-medium-3.1 | 0,024 TL | 0,1201 TL |
| mistralai/mistral-nemo | 0,0012 TL | 0,0024 TL |
| mistralai/mistral-small-24b-instruct-2501 | 0,003 TL | 0,0048 TL |
| mistralai/mistral-small-3.1-24b-instruct | 0,0018 TL | 0,0066 TL |
| mistralai/mistral-small-3.2-24b-instruct | 0,0045 TL | 0,012 TL |
| mistralai/mistral-small-2603 | 0,009 TL | 0,036 TL |
| mistralai/mistral-small-creative | 0,006 TL | 0,018 TL |
| mistralai/mixtral-8x22b-instruct | 0,1201 TL | 0,3604 TL |
| mistralai/mixtral-8x7b-instruct | 0,0324 TL | 0,0324 TL |
| mistralai/pixtral-large-2411 | 0,1201 TL | 0,3604 TL |
| mistralai/mistral-saba | 0,012 TL | 0,036 TL |
| mistralai/voxtral-small-24b-2507 | 0,006 TL | 0,018 TL |
| moonshotai/kimi-k2 | 0,0342 TL | 0,1381 TL |
| moonshotai/kimi-k2-0905 | 0,024 TL | 0,1201 TL |
| moonshotai/kimi-k2-thinking | 0,0282 TL | 0,1201 TL |
| moonshotai/kimi-k2.5 | 0,0252 TL | 0,1321 TL |
| morph/morph-v3-fast | 0,048 TL | 0,0721 TL |
| morph/morph-v3-large | 0,0541 TL | 0,1141 TL |
| gryphe/mythomax-l2-13b | 0,0036 TL | 0,0036 TL |
| nex-agi/deepseek-v3.1-nex-n1 | 0,0081 TL | 0,03 TL |
| nousresearch/hermes-3-llama-3.1-405b | 0,0601 TL | 0,0601 TL |
| nousresearch/hermes-3-llama-3.1-70b | 0,018 TL | 0,018 TL |
| nousresearch/hermes-4-405b | 0,0601 TL | 0,1802 TL |
| nousresearch/hermes-4-70b | 0,0078 TL | 0,024 TL |
| nousresearch/hermes-2-pro-llama-3-8b | 0,0084 TL | 0,0084 TL |
| nvidia/llama-3.1-nemotron-70b-instruct | 0,0721 TL | 0,0721 TL |
| nvidia/llama-3.1-nemotron-ultra-253b-v1 | 0,036 TL | 0,1081 TL |
| nvidia/llama-3.3-nemotron-super-49b-v1.5 | 0,006 TL | 0,024 TL |
| nvidia/nemotron-3-nano-30b-a3b | 0,003 TL | 0,012 TL |
| nvidia/nemotron-3-super-120b-a12b | 0,006 TL | 0,03 TL |
| nvidia/nemotron-nano-12b-v2-vl | 0,012 TL | 0,036 TL |
| nvidia/nemotron-nano-9b-v2 | 0,0024 TL | 0,0096 TL |
| openai/gpt-audio | 0,1502 TL | 0,6006 TL |
| openai/gpt-audio-mini | 0,036 TL | 0,1441 TL |
| openai/gpt-3.5-turbo | 0,03 TL | 0,0901 TL |
| openai/gpt-3.5-turbo-0613 | 0,0601 TL | 0,1201 TL |
| openai/gpt-3.5-turbo-16k | 0,1802 TL | 0,2402 TL |
| openai/gpt-3.5-turbo-instruct | 0,0901 TL | 0,1201 TL |
| openai/gpt-4 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-0314 | 1,8018 TL | 3,6036 TL |
| openai/gpt-4-turbo | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-1106-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4-turbo-preview | 0,6006 TL | 1,8018 TL |
| openai/gpt-4.1 | 0,1201 TL | 0,4805 TL |
| openai/gpt-4.1-mini | 0,024 TL | 0,0961 TL |
| openai/gpt-4.1-nano | 0,006 TL | 0,024 TL |
| openai/gpt-4o | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-05-13 | 0,3003 TL | 0,9009 TL |
| openai/gpt-4o-2024-08-06 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-2024-11-20 | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o:extended | 0,3604 TL | 1,0811 TL |
| openai/gpt-4o-audio-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-search-preview | 0,1502 TL | 0,6006 TL |
| openai/gpt-4o-mini | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-2024-07-18 | 0,009 TL | 0,036 TL |
| openai/gpt-4o-mini-search-preview | 0,009 TL | 0,036 TL |
| openai/gpt-5 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5-image | 0,6006 TL | 0,6006 TL |
| openai/gpt-5-image-mini | 0,1502 TL | 0,1201 TL |
| openai/gpt-5-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5-nano | 0,003 TL | 0,024 TL |
| openai/gpt-5-pro | 0,9009 TL | 7,2072 TL |
| openai/gpt-5.1 | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-chat | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-max | 0,0751 TL | 0,6006 TL |
| openai/gpt-5.1-codex-mini | 0,015 TL | 0,1201 TL |
| openai/gpt-5.2 | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-chat | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.2-pro | 1,2613 TL | 10,0901 TL |
| openai/gpt-5.2-codex | 0,1051 TL | 0,8408 TL |
| openai/gpt-5.4 | 0,1502 TL | 0,9009 TL |
| openai/gpt-5.4-mini | 0,045 TL | 0,2703 TL |
| openai/gpt-5.4-nano | 0,012 TL | 0,0751 TL |
| openai/gpt-5.4-pro | 1,8018 TL | 10,8108 TL |
| openai/gpt-oss-120b | 0,0023 TL | 0,0114 TL |
| openai/gpt-oss-20b | 0,0018 TL | 0,0066 TL |
| openai/gpt-oss-safeguard-20b | 0,0045 TL | 0,018 TL |
| openai/o1 | 0,9009 TL | 3,6036 TL |
| openai/o1-pro | 9,009 TL | 36,036 TL |
| openai/o3 | 0,1201 TL | 0,4805 TL |
| openai/o3-deep-research | 0,6006 TL | 2,4024 TL |
| openai/o3-mini | 0,0661 TL | 0,2643 TL |
| openai/o3-mini-high | 0,0661 TL | 0,2643 TL |
| openai/o3-pro | 1,2012 TL | 4,8048 TL |
| openai/o4-mini | 0,0661 TL | 0,2643 TL |
| openai/o4-mini-deep-research | 0,1201 TL | 0,4805 TL |
| openai/o4-mini-high | 0,0661 TL | 0,2643 TL |
| perplexity/sonar | 0,0601 TL | 0,0601 TL |
| perplexity/sonar-deep-research | 0,1201 TL | 0,4805 TL |
| perplexity/sonar-pro | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-pro-search | 0,1802 TL | 0,9009 TL |
| perplexity/sonar-reasoning-pro | 0,1201 TL | 0,4805 TL |
| prime-intellect/intellect-3 | 0,012 TL | 0,0661 TL |
| qwen/qwen-plus-2025-07-28 | 0,0156 TL | 0,0468 TL |
| qwen/qwen-plus-2025-07-28:thinking | 0,0156 TL | 0,0468 TL |
| qwen/qwen-vl-max | 0,0312 TL | 0,1249 TL |
| qwen/qwen-vl-plus | 0,0082 TL | 0,0246 TL |
| qwen/qwen-max | 0,0625 TL | 0,2498 TL |
| qwen/qwen-plus | 0,0156 TL | 0,0468 TL |
| qwen/qwen-turbo | 0,002 TL | 0,0078 TL |
| qwen/qwen-2.5-7b-instruct | 0,0024 TL | 0,006 TL |
| qwen/qwen2.5-coder-7b-instruct | 0,0018 TL | 0,0054 TL |
| qwen/qwen2.5-vl-32b-instruct | 0,012 TL | 0,036 TL |
| qwen/qwen2.5-vl-72b-instruct | 0,048 TL | 0,048 TL |
| qwen/qwen3-14b | 0,0036 TL | 0,0144 TL |
| qwen/qwen3-235b-a22b | 0,0273 TL | 0,1093 TL |
| qwen/qwen3-235b-a22b-2507 | 0,0043 TL | 0,006 TL |
| qwen/qwen3-235b-a22b-thinking-2507 | 0,009 TL | 0,0898 TL |
| qwen/qwen3-30b-a3b | 0,0048 TL | 0,0168 TL |
| qwen/qwen3-30b-a3b-instruct-2507 | 0,0054 TL | 0,018 TL |
| qwen/qwen3-30b-a3b-thinking-2507 | 0,0048 TL | 0,024 TL |
| qwen/qwen3-32b | 0,0048 TL | 0,0144 TL |
| qwen/qwen3-8b | 0,003 TL | 0,024 TL |
| qwen/qwen3-coder-30b-a3b-instruct | 0,0042 TL | 0,0162 TL |
| qwen/qwen3-coder | 0,0132 TL | 0,0601 TL |
| qwen/qwen3-coder-flash | 0,0117 TL | 0,0586 TL |
| qwen/qwen3-coder-next | 0,0072 TL | 0,045 TL |
| qwen/qwen3-coder-plus | 0,039 TL | 0,1952 TL |
| qwen/qwen3-max | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-max-thinking | 0,0468 TL | 0,2342 TL |
| qwen/qwen3-next-80b-a3b-instruct | 0,0054 TL | 0,0661 TL |
| qwen/qwen3-next-80b-a3b-thinking | 0,0059 TL | 0,0468 TL |
| qwen/qwen3-vl-235b-a22b-instruct | 0,012 TL | 0,0529 TL |
| qwen/qwen3-vl-235b-a22b-thinking | 0,0156 TL | 0,1562 TL |
| qwen/qwen3-vl-30b-a3b-instruct | 0,0078 TL | 0,0312 TL |
| qwen/qwen3-vl-30b-a3b-thinking | 0,0078 TL | 0,0937 TL |
| qwen/qwen3-vl-32b-instruct | 0,0062 TL | 0,025 TL |
| qwen/qwen3-vl-8b-instruct | 0,0048 TL | 0,03 TL |
| qwen/qwen3-vl-8b-thinking | 0,007 TL | 0,082 TL |
| qwen/qwen3.5-397b-a17b | 0,0234 TL | 0,1405 TL |
| qwen/qwen3.5-plus-02-15 | 0,0156 TL | 0,0937 TL |
| qwen/qwen3.5-122b-a10b | 0,0156 TL | 0,1249 TL |
| qwen/qwen3.5-27b | 0,0117 TL | 0,0937 TL |
| qwen/qwen3.5-35b-a3b | 0,0098 TL | 0,0781 TL |
| qwen/qwen3.5-9b | 0,003 TL | 0,009 TL |
| qwen/qwen3.5-flash-02-23 | 0,0039 TL | 0,0156 TL |
| qwen/qwq-32b | 0,009 TL | 0,0348 TL |
| qwen/qwen-2.5-72b-instruct | 0,0072 TL | 0,0234 TL |
| qwen/qwen-2.5-coder-32b-instruct | 0,0396 TL | 0,0601 TL |
| reka/reka-edge | 0,006 TL | 0,006 TL |
| relace/relace-apply-3 | 0,0511 TL | 0,0751 TL |
| relace/relace-search | 0,0601 TL | 0,1802 TL |
| undi95/remm-slerp-l2-13b | 0,027 TL | 0,039 TL |
| sao10k/l3-lunaris-8b | 0,0024 TL | 0,003 TL |
| sao10k/l3-euryale-70b | 0,0889 TL | 0,0889 TL |
| sao10k/l3.1-70b-hanami-x1 | 0,1802 TL | 0,1802 TL |
| sao10k/l3.1-euryale-70b | 0,0511 TL | 0,0511 TL |
| sao10k/l3.3-euryale-70b | 0,039 TL | 0,045 TL |
| stepfun/step-3.5-flash | 0,006 TL | 0,018 TL |
| switchpoint/router | 0,0511 TL | 0,2042 TL |
| tencent/hunyuan-a13b-instruct | 0,0084 TL | 0,0342 TL |
| thedrummer/cydonia-24b-v4.1 | 0,018 TL | 0,03 TL |
| thedrummer/rocinante-12b | 0,0102 TL | 0,0258 TL |
| thedrummer/skyfall-36b-v2 | 0,033 TL | 0,048 TL |
| thedrummer/unslopnemo-12b | 0,024 TL | 0,024 TL |
| tngtech/deepseek-r1t2-chimera | 0,018 TL | 0,0661 TL |
| alibaba/tongyi-deepresearch-30b-a3b | 0,0054 TL | 0,027 TL |
| upstage/solar-pro-3 | 0,009 TL | 0,036 TL |
| microsoft/wizardlm-2-8x22b | 0,0372 TL | 0,0372 TL |
| writer/palmyra-x5 | 0,036 TL | 0,3604 TL |
| x-ai/grok-3 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-beta | 0,1802 TL | 0,9009 TL |
| x-ai/grok-3-mini | 0,018 TL | 0,03 TL |
| x-ai/grok-3-mini-beta | 0,018 TL | 0,03 TL |
| x-ai/grok-4 | 0,1802 TL | 0,9009 TL |
| x-ai/grok-4-fast | 0,012 TL | 0,03 TL |
| x-ai/grok-4.20-beta | 0,1201 TL | 0,3604 TL |
| x-ai/grok-4.20-multi-agent-beta | 0,1201 TL | 0,3604 TL |
| xiaomi/mimo-v2-flash | 0,0054 TL | 0,0174 TL |
| xiaomi/mimo-v2-omni | 0,024 TL | 0,1201 TL |
| xiaomi/mimo-v2-pro | 0,0601 TL | 0,1802 TL |
| z-ai/glm-4-32b | 0,006 TL | 0,006 TL |
| z-ai/glm-4.5 | 0,036 TL | 0,1321 TL |
| z-ai/glm-4.5-air | 0,0078 TL | 0,0511 TL |
| z-ai/glm-4.5v | 0,036 TL | 0,1081 TL |
| z-ai/glm-4.6 | 0,0234 TL | 0,1141 TL |
| z-ai/glm-4.6v | 0,018 TL | 0,0541 TL |
| z-ai/glm-4.7 | 0,0234 TL | 0,1051 TL |
| z-ai/glm-4.7-flash | 0,0036 TL | 0,024 TL |
| z-ai/glm-5 | 0,0432 TL | 0,1381 TL |
| z-ai/glm-5-turbo | 0,0721 TL | 0,2402 TL |
curl -X POST https://tektik.ai/api/v1/services/writing_helper \
-H "Authorization: Bearer tkai_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "openai/gpt-4o",
"request": {
"text": "Yazı"
}
}'
Response Example
{
"response": "...",
"costTl": 0.0234
}
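When the service is called repeatedly, the costTl field returned with each response can be summed to track spend. A small sketch under that assumption, with placeholder inputs and a model taken from the table above.

# Sketch: run the writing helper over several texts and track total spend
# via the costTl field returned with each response. Inputs are placeholders.
import requests

API_KEY = "tkai_YOUR_API_KEY"  # placeholder
URL = "https://tektik.ai/api/v1/services/writing_helper"
HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

texts = ["First draft to check", "Second draft to check"]  # placeholder inputs
total_cost = 0.0

for text in texts:
    resp = requests.post(
        URL,
        headers=HEADERS,
        json={"model": "x-ai/grok-4-fast", "request": {"text": text}},
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json()
    total_cost += data["costTl"]
    print(data["response"])

print(f"Total spend: {total_cost:.4f} TL")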
Models
List the available AI models.
/api/v1/models
Returns all active models and their current TL prices.
curl https://tektik.ai/api/v1/models \
-H "Authorization: Bearer tkai_YOUR_API_KEY"Çeviri
Translation
| Model | Turkish Quality | Input (1K tokens) | Output (1K tokens) |
|---|---|---|---|
| OpenAI: GPT-5.3 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.3-Codex | - | 0,1051 TL | 0,8408 TL |
| xAI: Grok 4.1 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok Code Fast 1 | - | 0,012 TL | 0,0901 TL |
| Anthropic: Claude Sonnet 4.6 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4.6 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Haiku 4.5 | - | 0,0601 TL | 0,3003 TL |
| AI21: Jamba Large 1.7 | - | 0,1201 TL | 0,4805 TL |
| AionLabs: Aion-1.0 | - | 0,2402 TL | 0,4805 TL |
| AionLabs: Aion-1.0-Mini | - | 0,042 TL | 0,0841 TL |
| AionLabs: Aion-2.0 | - | 0,048 TL | 0,0961 TL |
| AionLabs: Aion-RP 1.0 (8B) | - | 0,048 TL | 0,0961 TL |
| AlfredPros: CodeLLaMa 7B Instruct Solidity | - | 0,048 TL | 0,0721 TL |
| AllenAI: Olmo 2 32B Instruct | - | 0,003 TL | 0,012 TL |
| AllenAI: Olmo 3 32B Think | - | 0,009 TL | 0,03 TL |
| AllenAI: Olmo 3.1 32B Instruct | - | 0,012 TL | 0,036 TL |
| AllenAI: Olmo 3.1 32B Think | - | 0,009 TL | 0,03 TL |
| Amazon: Nova 2 Lite | - | 0,018 TL | 0,1502 TL |
| Amazon: Nova Lite 1.0 | - | 0,0036 TL | 0,0144 TL |
| Amazon: Nova Micro 1.0 | - | 0,0021 TL | 0,0084 TL |
| Amazon: Nova Premier 1.0 | - | 0,1502 TL | 0,7507 TL |
| Amazon: Nova Pro 1.0 | - | 0,048 TL | 0,1922 TL |
| Anthropic: Claude 3 Haiku | - | 0,015 TL | 0,0751 TL |
| Anthropic: Claude 3.5 Haiku | - | 0,048 TL | 0,2402 TL |
| Anthropic: Claude 3.5 Sonnet | - | 0,3604 TL | 1,8018 TL |
| Anthropic: Claude 3.7 Sonnet | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude 3.7 Sonnet (thinking) | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.1 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.5 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Sonnet 4 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Sonnet 4.5 | - | 0,1802 TL | 0,9009 TL |
| Arcee AI: Coder Large | - | 0,03 TL | 0,048 TL |
| Arcee AI: Maestro Reasoning | - | 0,0541 TL | 0,1982 TL |
| Arcee AI: Spotlight | - | 0,0108 TL | 0,0108 TL |
| Arcee AI: Trinity Mini | - | 0,0027 TL | 0,009 TL |
| Arcee AI: Virtuoso Large | - | 0,045 TL | 0,0721 TL |
| Auto Router | - | -60.060,00 TL | -60.060,00 TL |
| Baidu: ERNIE 4.5 21B A3B | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 21B A3B Thinking | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 300B A47B | - | 0,0168 TL | 0,0661 TL |
| Baidu: ERNIE 4.5 VL 28B A3B | - | 0,0084 TL | 0,0336 TL |
| Baidu: ERNIE 4.5 VL 424B A47B | - | 0,0252 TL | 0,0751 TL |
| Body Builder (beta) | - | -60.060,00 TL | -60.060,00 TL |
| ByteDance Seed: Seed 1.6 | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed 1.6 Flash | - | 0,0045 TL | 0,018 TL |
| ByteDance Seed: Seed-2.0-Lite | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed-2.0-Mini | - | 0,006 TL | 0,024 TL |
| ByteDance: UI-TARS 7B | - | 0,006 TL | 0,012 TL |
| Cohere: Command A | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R (08-2024) | - | 0,009 TL | 0,036 TL |
| Cohere: Command R+ (08-2024) | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R7B (12-2024) | - | 0,0023 TL | 0,009 TL |
| Deep Cogito: Cogito v2.1 671B | - | 0,0751 TL | 0,0751 TL |
| DeepSeek: DeepSeek V3 | - | 0,0192 TL | 0,0535 TL |
| DeepSeek: DeepSeek V3 0324 | - | 0,012 TL | 0,0462 TL |
| DeepSeek: DeepSeek V3.1 | - | 0,009 TL | 0,045 TL |
| DeepSeek: DeepSeek V3.1 Terminus | - | 0,0126 TL | 0,0474 TL |
| DeepSeek: DeepSeek V3.2 | - | 0,0156 TL | 0,0228 TL |
| DeepSeek: DeepSeek V3.2 Exp | - | 0,0162 TL | 0,0246 TL |
| DeepSeek: DeepSeek V3.2 Speciale | - | 0,024 TL | 0,0721 TL |
| DeepSeek: R1 | - | 0,042 TL | 0,1502 TL |
| DeepSeek: R1 0528 | - | 0,027 TL | 0,1291 TL |
| DeepSeek: R1 Distill Llama 70B | - | 0,042 TL | 0,048 TL |
| DeepSeek: R1 Distill Qwen 32B | - | 0,0174 TL | 0,0174 TL |
| EleutherAI: Llemma 7b | - | 0,048 TL | 0,0721 TL |
| EssentialAI: Rnj 1 Instruct | - | 0,009 TL | 0,009 TL |
| Goliath 120B | - | 0,2252 TL | 0,4505 TL |
| Google: Gemini 2.0 Flash | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.0 Flash Lite | - | 0,0045 TL | 0,018 TL |
| Google: Gemini 2.5 Flash | - | 0,018 TL | 0,1502 TL |
| Google: Gemini 2.5 Flash Lite | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Flash Lite Preview 09-2025 | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Pro | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 05-06 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 06-05 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 3 Flash Preview | - | 0,03 TL | 0,1802 TL |
| Google: Gemini 3.1 Flash Lite Preview | - | 0,015 TL | 0,0901 TL |
| Google: Gemini 3.1 Pro Preview | - | 0,1201 TL | 0,7207 TL |
| Google: Gemini 3.1 Pro Preview Custom Tools | - | 0,1201 TL | 0,7207 TL |
| Google: Gemma 2 27B | - | 0,039 TL | 0,039 TL |
| Google: Gemma 2 9B | - | 0,0018 TL | 0,0054 TL |
| Google: Gemma 3 12B | - | 0,0024 TL | 0,0078 TL |
| Google: Gemma 3 27B | - | 0,0048 TL | 0,0096 TL |
| Google: Gemma 3 4B | - | 0,0024 TL | 0,0048 TL |
| Google: Gemma 3n 4B | - | 0,0012 TL | 0,0024 TL |
| Google: Nano Banana (Gemini 2.5 Flash Image) | - | 0,018 TL | 0,1502 TL |
| Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) | - | 0,03 TL | 0,1802 TL |
| Google: Nano Banana Pro (Gemini 3 Pro Image Preview) | - | 0,1201 TL | 0,7207 TL |
| IBM: Granite 4.0 Micro | - | 0,001 TL | 0,0066 TL |
| Inception: Mercury | - | 0,015 TL | 0,045 TL |
| Inception: Mercury 2 | - | 0,015 TL | 0,045 TL |
| Inception: Mercury Coder | - | 0,015 TL | 0,045 TL |
| Inflection: Inflection 3 Pi | - | 0,1502 TL | 0,6006 TL |
| Inflection: Inflection 3 Productivity | - | 0,1502 TL | 0,6006 TL |
| Kwaipilot: KAT-Coder-Pro V1 | - | 0,0124 TL | 0,0497 TL |
| Kwaipilot: KAT-Coder-Pro V2 | - | 0,018 TL | 0,0721 TL |
| LiquidAI: LFM2-2.6B | - | 0,0006 TL | 0,0012 TL |
| LiquidAI: LFM2-24B-A2B | - | 0,0018 TL | 0,0072 TL |
| LiquidAI: LFM2-8B-A1B | - | 0,0006 TL | 0,0012 TL |
| Llama Guard 3 8B | - | 0,0012 TL | 0,0036 TL |
| Magnum v4 72B | - | 0,1802 TL | 0,3003 TL |
| Mancer: Weaver (alpha) | - | 0,045 TL | 0,0601 TL |
| Meituan: LongCat Flash Chat | - | 0,012 TL | 0,048 TL |
| Meta: Llama 3 70B Instruct | - | 0,0306 TL | 0,0444 TL |
| Meta: Llama 3 8B Instruct | - | 0,0018 TL | 0,0024 TL |
| Meta: Llama 3.1 70B Instruct | - | 0,024 TL | 0,024 TL |
| Meta: Llama 3.1 8B Instruct | - | 0,0012 TL | 0,003 TL |
| Meta: Llama 3.2 11B Vision Instruct | - | 0,0029 TL | 0,0029 TL |
| Meta: Llama 3.2 1B Instruct | - | 0,0016 TL | 0,012 TL |
| Meta: Llama 3.2 3B Instruct | - | 0,0031 TL | 0,0204 TL |
| Meta: Llama 3.3 70B Instruct | - | 0,006 TL | 0,0192 TL |
| Meta: Llama 4 Maverick | - | 0,009 TL | 0,036 TL |
| Meta: Llama 4 Scout | - | 0,0048 TL | 0,018 TL |
| Meta: Llama Guard 4 12B | - | 0,0108 TL | 0,0108 TL |
| Microsoft: Phi 4 | - | 0,0039 TL | 0,0084 TL |
| MiniMax: MiniMax M1 | - | 0,024 TL | 0,1321 TL |
| MiniMax: MiniMax M2 | - | 0,0153 TL | 0,0601 TL |
| MiniMax: MiniMax M2-her | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax M2.1 | - | 0,0162 TL | 0,0571 TL |
| MiniMax: MiniMax M2.5 | - | 0,0114 TL | 0,0691 TL |
| MiniMax: MiniMax M2.7 | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax-01 | - | 0,012 TL | 0,0661 TL |
| Mistral Large | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2407 | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Codestral 2508 | - | 0,018 TL | 0,0541 TL |
| Mistral: Devstral 2 2512 | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Medium | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Small 1.1 | - | 0,006 TL | 0,018 TL |
| Mistral: Ministral 3 14B 2512 | - | 0,012 TL | 0,012 TL |
| Mistral: Ministral 3 3B 2512 | - | 0,006 TL | 0,006 TL |
| Mistral: Ministral 3 8B 2512 | - | 0,009 TL | 0,009 TL |
| Mistral: Mistral 7B Instruct v0.1 | - | 0,0066 TL | 0,0114 TL |
| Mistral: Mistral Large 3 2512 | - | 0,03 TL | 0,0901 TL |
| Mistral: Mistral Medium 3 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Medium 3.1 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Nemo | - | 0,0012 TL | 0,0024 TL |
| Mistral: Mistral Small 3 | - | 0,003 TL | 0,0048 TL |
| Mistral: Mistral Small 3.1 24B | - | 0,0018 TL | 0,0066 TL |
| Mistral: Mistral Small 3.2 24B | - | 0,0045 TL | 0,012 TL |
| Mistral: Mistral Small 4 | - | 0,009 TL | 0,036 TL |
| Mistral: Mistral Small Creative | - | 0,006 TL | 0,018 TL |
| Mistral: Mixtral 8x22B Instruct | - | 0,1201 TL | 0,3604 TL |
| Mistral: Mixtral 8x7B Instruct | - | 0,0324 TL | 0,0324 TL |
| Mistral: Pixtral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Saba | - | 0,012 TL | 0,036 TL |
| Mistral: Voxtral Small 24B 2507 | - | 0,006 TL | 0,018 TL |
| MoonshotAI: Kimi K2 0711 | - | 0,0342 TL | 0,1381 TL |
| MoonshotAI: Kimi K2 0905 | - | 0,024 TL | 0,1201 TL |
| MoonshotAI: Kimi K2 Thinking | - | 0,0282 TL | 0,1201 TL |
| MoonshotAI: Kimi K2.5 | - | 0,0252 TL | 0,1321 TL |
| Morph: Morph V3 Fast | - | 0,048 TL | 0,0721 TL |
| Morph: Morph V3 Large | - | 0,0541 TL | 0,1141 TL |
| MythoMax 13B | - | 0,0036 TL | 0,0036 TL |
| Nex AGI: DeepSeek V3.1 Nex N1 | - | 0,0081 TL | 0,03 TL |
| Nous: Hermes 3 405B Instruct | - | 0,0601 TL | 0,0601 TL |
| Nous: Hermes 3 70B Instruct | - | 0,018 TL | 0,018 TL |
| Nous: Hermes 4 405B | - | 0,0601 TL | 0,1802 TL |
| Nous: Hermes 4 70B | - | 0,0078 TL | 0,024 TL |
| NousResearch: Hermes 2 Pro - Llama-3 8B | - | 0,0084 TL | 0,0084 TL |
| NVIDIA: Llama 3.1 Nemotron 70B Instruct | - | 0,0721 TL | 0,0721 TL |
| NVIDIA: Llama 3.1 Nemotron Ultra 253B v1 | - | 0,036 TL | 0,1081 TL |
| NVIDIA: Llama 3.3 Nemotron Super 49B V1.5 | - | 0,006 TL | 0,024 TL |
| NVIDIA: Nemotron 3 Nano 30B A3B | - | 0,003 TL | 0,012 TL |
| NVIDIA: Nemotron 3 Super | - | 0,006 TL | 0,03 TL |
| NVIDIA: Nemotron Nano 12B 2 VL | - | 0,012 TL | 0,036 TL |
| NVIDIA: Nemotron Nano 9B V2 | - | 0,0024 TL | 0,0096 TL |
| OpenAI: GPT Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT Audio Mini | - | 0,036 TL | 0,1441 TL |
| OpenAI: GPT-3.5 Turbo | - | 0,03 TL | 0,0901 TL |
| OpenAI: GPT-3.5 Turbo (older v0613) | - | 0,0601 TL | 0,1201 TL |
| OpenAI: GPT-3.5 Turbo 16k | - | 0,1802 TL | 0,2402 TL |
| OpenAI: GPT-3.5 Turbo Instruct | - | 0,0901 TL | 0,1201 TL |
| OpenAI: GPT-4 | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 (older v0314) | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 Turbo | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo (older v1106) | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo Preview | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4.1 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: GPT-4.1 Mini | - | 0,024 TL | 0,0961 TL |
| OpenAI: GPT-4.1 Nano | - | 0,006 TL | 0,024 TL |
| OpenAI: GPT-4o | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-05-13) | - | 0,3003 TL | 0,9009 TL |
| OpenAI: GPT-4o (2024-08-06) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-11-20) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (extended) | - | 0,3604 TL | 1,0811 TL |
| OpenAI: GPT-4o Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o Search Preview | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o-mini | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini (2024-07-18) | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini Search Preview | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-5 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Image | - | 0,6006 TL | 0,6006 TL |
| OpenAI: GPT-5 Image Mini | - | 0,1502 TL | 0,1201 TL |
| OpenAI: GPT-5 Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5 Nano | - | 0,003 TL | 0,024 TL |
| OpenAI: GPT-5 Pro | - | 0,9009 TL | 7,2072 TL |
| OpenAI: GPT-5.1 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Max | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5.2 | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Pro | - | 1,2613 TL | 10,0901 TL |
| OpenAI: GPT-5.2-Codex | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.4 | - | 0,1502 TL | 0,9009 TL |
| OpenAI: GPT-5.4 Mini | - | 0,045 TL | 0,2703 TL |
| OpenAI: GPT-5.4 Nano | - | 0,012 TL | 0,0751 TL |
| OpenAI: GPT-5.4 Pro | - | 1,8018 TL | 10,8108 TL |
| OpenAI: gpt-oss-120b | - | 0,0023 TL | 0,0114 TL |
| OpenAI: gpt-oss-20b | - | 0,0018 TL | 0,0066 TL |
| OpenAI: gpt-oss-safeguard-20b | - | 0,0045 TL | 0,018 TL |
| OpenAI: o1 | - | 0,9009 TL | 3,6036 TL |
| OpenAI: o1-pro | - | 9,009 TL | 36,036 TL |
| OpenAI: o3 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o3 Deep Research | - | 0,6006 TL | 2,4024 TL |
| OpenAI: o3 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Mini High | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Pro | - | 1,2012 TL | 4,8048 TL |
| OpenAI: o4 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o4 Mini Deep Research | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o4 Mini High | - | 0,0661 TL | 0,2643 TL |
| Perplexity: Sonar | - | 0,0601 TL | 0,0601 TL |
| Perplexity: Sonar Deep Research | - | 0,1201 TL | 0,4805 TL |
| Perplexity: Sonar Pro | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Pro Search | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Reasoning Pro | - | 0,1201 TL | 0,4805 TL |
| Prime Intellect: INTELLECT-3 | - | 0,012 TL | 0,0661 TL |
| Qwen: Qwen Plus 0728 | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen Plus 0728 (thinking) | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen VL Max | - | 0,0312 TL | 0,1249 TL |
| Qwen: Qwen VL Plus | - | 0,0082 TL | 0,0246 TL |
| Qwen: Qwen-Max | - | 0,0625 TL | 0,2498 TL |
| Qwen: Qwen-Plus | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen-Turbo | - | 0,002 TL | 0,0078 TL |
| Qwen: Qwen2.5 7B Instruct | - | 0,0024 TL | 0,006 TL |
| Qwen: Qwen2.5 Coder 7B Instruct | - | 0,0018 TL | 0,0054 TL |
| Qwen: Qwen2.5 VL 32B Instruct | - | 0,012 TL | 0,036 TL |
| Qwen: Qwen2.5 VL 72B Instruct | - | 0,048 TL | 0,048 TL |
| Qwen: Qwen3 14B | - | 0,0036 TL | 0,0144 TL |
| Qwen: Qwen3 235B A22B | - | 0,0273 TL | 0,1093 TL |
| Qwen: Qwen3 235B A22B Instruct 2507 | - | 0,0043 TL | 0,006 TL |
| Qwen: Qwen3 235B A22B Thinking 2507 | - | 0,009 TL | 0,0898 TL |
| Qwen: Qwen3 30B A3B | - | 0,0048 TL | 0,0168 TL |
| Qwen: Qwen3 30B A3B Instruct 2507 | - | 0,0054 TL | 0,018 TL |
| Qwen: Qwen3 30B A3B Thinking 2507 | - | 0,0048 TL | 0,024 TL |
| Qwen: Qwen3 32B | - | 0,0048 TL | 0,0144 TL |
| Qwen: Qwen3 8B | - | 0,003 TL | 0,024 TL |
| Qwen: Qwen3 Coder 30B A3B Instruct | - | 0,0042 TL | 0,0162 TL |
| Qwen: Qwen3 Coder 480B A35B | - | 0,0132 TL | 0,0601 TL |
| Qwen: Qwen3 Coder Flash | - | 0,0117 TL | 0,0586 TL |
| Qwen: Qwen3 Coder Next | - | 0,0072 TL | 0,045 TL |
| Qwen: Qwen3 Coder Plus | - | 0,039 TL | 0,1952 TL |
| Qwen: Qwen3 Max | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Max Thinking | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Next 80B A3B Instruct | - | 0,0054 TL | 0,0661 TL |
| Qwen: Qwen3 Next 80B A3B Thinking | - | 0,0059 TL | 0,0468 TL |
| Qwen: Qwen3 VL 235B A22B Instruct | - | 0,012 TL | 0,0529 TL |
| Qwen: Qwen3 VL 235B A22B Thinking | - | 0,0156 TL | 0,1562 TL |
| Qwen: Qwen3 VL 30B A3B Instruct | - | 0,0078 TL | 0,0312 TL |
| Qwen: Qwen3 VL 30B A3B Thinking | - | 0,0078 TL | 0,0937 TL |
| Qwen: Qwen3 VL 32B Instruct | - | 0,0062 TL | 0,025 TL |
| Qwen: Qwen3 VL 8B Instruct | - | 0,0048 TL | 0,03 TL |
| Qwen: Qwen3 VL 8B Thinking | - | 0,007 TL | 0,082 TL |
| Qwen: Qwen3.5 397B A17B | - | 0,0234 TL | 0,1405 TL |
| Qwen: Qwen3.5 Plus 2026-02-15 | - | 0,0156 TL | 0,0937 TL |
| Qwen: Qwen3.5-122B-A10B | - | 0,0156 TL | 0,1249 TL |
| Qwen: Qwen3.5-27B | - | 0,0117 TL | 0,0937 TL |
| Qwen: Qwen3.5-35B-A3B | - | 0,0098 TL | 0,0781 TL |
| Qwen: Qwen3.5-9B | - | 0,003 TL | 0,009 TL |
| Qwen: Qwen3.5-Flash | - | 0,0039 TL | 0,0156 TL |
| Qwen: QwQ 32B | - | 0,009 TL | 0,0348 TL |
| Qwen2.5 72B Instruct | - | 0,0072 TL | 0,0234 TL |
| Qwen2.5 Coder 32B Instruct | - | 0,0396 TL | 0,0601 TL |
| Reka Edge | - | 0,006 TL | 0,006 TL |
| Relace: Relace Apply 3 | - | 0,0511 TL | 0,0751 TL |
| Relace: Relace Search | - | 0,0601 TL | 0,1802 TL |
| ReMM SLERP 13B | - | 0,027 TL | 0,039 TL |
| Sao10K: Llama 3 8B Lunaris | - | 0,0024 TL | 0,003 TL |
| Sao10k: Llama 3 Euryale 70B v2.1 | - | 0,0889 TL | 0,0889 TL |
| Sao10K: Llama 3.1 70B Hanami x1 | - | 0,1802 TL | 0,1802 TL |
| Sao10K: Llama 3.1 Euryale 70B v2.2 | - | 0,0511 TL | 0,0511 TL |
| Sao10K: Llama 3.3 Euryale 70B | - | 0,039 TL | 0,045 TL |
| StepFun: Step 3.5 Flash | - | 0,006 TL | 0,018 TL |
| Switchpoint Router | - | 0,0511 TL | 0,2042 TL |
| Tencent: Hunyuan A13B Instruct | - | 0,0084 TL | 0,0342 TL |
| TheDrummer: Cydonia 24B V4.1 | - | 0,018 TL | 0,03 TL |
| TheDrummer: Rocinante 12B | - | 0,0102 TL | 0,0258 TL |
| TheDrummer: Skyfall 36B V2 | - | 0,033 TL | 0,048 TL |
| TheDrummer: UnslopNemo 12B | - | 0,024 TL | 0,024 TL |
| TNG: DeepSeek R1T2 Chimera | - | 0,018 TL | 0,0661 TL |
| Tongyi DeepResearch 30B A3B | - | 0,0054 TL | 0,027 TL |
| Upstage: Solar Pro 3 | - | 0,009 TL | 0,036 TL |
| WizardLM-2 8x22B | - | 0,0372 TL | 0,0372 TL |
| Writer: Palmyra X5 | - | 0,036 TL | 0,3604 TL |
| xAI: Grok 3 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Beta | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Mini | - | 0,018 TL | 0,03 TL |
| xAI: Grok 3 Mini Beta | - | 0,018 TL | 0,03 TL |
| xAI: Grok 4 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 4 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok 4.20 Beta | - | 0,1201 TL | 0,3604 TL |
| xAI: Grok 4.20 Multi-Agent Beta | - | 0,1201 TL | 0,3604 TL |
| Xiaomi: MiMo-V2-Flash | - | 0,0054 TL | 0,0174 TL |
| Xiaomi: MiMo-V2-Omni | - | 0,024 TL | 0,1201 TL |
| Xiaomi: MiMo-V2-Pro | - | 0,0601 TL | 0,1802 TL |
| Z.ai: GLM 4 32B | - | 0,006 TL | 0,006 TL |
| Z.ai: GLM 4.5 | - | 0,036 TL | 0,1321 TL |
| Z.ai: GLM 4.5 Air | - | 0,0078 TL | 0,0511 TL |
| Z.ai: GLM 4.5V | - | 0,036 TL | 0,1081 TL |
| Z.ai: GLM 4.6 | - | 0,0234 TL | 0,1141 TL |
| Z.ai: GLM 4.6V | - | 0,018 TL | 0,0541 TL |
| Z.ai: GLM 4.7 | - | 0,0234 TL | 0,1051 TL |
| Z.ai: GLM 4.7 Flash | - | 0,0036 TL | 0,024 TL |
| Z.ai: GLM 5 | - | 0,0432 TL | 0,1381 TL |
| Z.ai: GLM 5 Turbo | - | 0,0721 TL | 0,2402 TL |
| Anthropic: Claude Opus 4.6 (Fast) | - | 1,8018 TL | 9,009 TL |
| Anthropic: Claude Opus 4.7 | - | 0,3003 TL | 1,5015 TL |
| Arcee AI: Trinity Large Thinking | - | 0,0132 TL | 0,0511 TL |
| Google: Gemma 4 26B A4B | - | 0,0042 TL | 0,021 TL |
| Google: Gemma 4 31B | - | 0,0078 TL | 0,0228 TL |
| MoonshotAI: Kimi K2.6 | - | 0,036 TL | 0,1682 TL |
| Qwen: Qwen3.6 Plus | - | 0,0195 TL | 0,1171 TL |
| Reka Edge | - | 0,006 TL | 0,006 TL |
| Reka Flash 3 | - | 0,006 TL | 0,012 TL |
| xAI: Grok 4.20 | - | 0,1201 TL | 0,3604 TL |
| xAI: Grok 4.20 Multi-Agent | - | 0,1201 TL | 0,3604 TL |
| Z.ai: GLM 5.1 | - | 0,0419 TL | 0,2643 TL |
| Z.ai: GLM 5V Turbo | - | 0,0721 TL | 0,2402 TL |
SEO Meta Generator
| Model | Turkish Quality | Input (1K tokens) | Output (1K tokens) |
|---|---|---|---|
| OpenAI: GPT-5.3 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.3-Codex | - | 0,1051 TL | 0,8408 TL |
| xAI: Grok 4.1 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok Code Fast 1 | - | 0,012 TL | 0,0901 TL |
| Anthropic: Claude Sonnet 4.6 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4.6 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Haiku 4.5 | - | 0,0601 TL | 0,3003 TL |
| AI21: Jamba Large 1.7 | - | 0,1201 TL | 0,4805 TL |
| AionLabs: Aion-1.0 | - | 0,2402 TL | 0,4805 TL |
| AionLabs: Aion-1.0-Mini | - | 0,042 TL | 0,0841 TL |
| AionLabs: Aion-2.0 | - | 0,048 TL | 0,0961 TL |
| AionLabs: Aion-RP 1.0 (8B) | - | 0,048 TL | 0,0961 TL |
| AlfredPros: CodeLLaMa 7B Instruct Solidity | - | 0,048 TL | 0,0721 TL |
| AllenAI: Olmo 2 32B Instruct | - | 0,003 TL | 0,012 TL |
| AllenAI: Olmo 3 32B Think | - | 0,009 TL | 0,03 TL |
| AllenAI: Olmo 3.1 32B Instruct | - | 0,012 TL | 0,036 TL |
| AllenAI: Olmo 3.1 32B Think | - | 0,009 TL | 0,03 TL |
| Amazon: Nova 2 Lite | - | 0,018 TL | 0,1502 TL |
| Amazon: Nova Lite 1.0 | - | 0,0036 TL | 0,0144 TL |
| Amazon: Nova Micro 1.0 | - | 0,0021 TL | 0,0084 TL |
| Amazon: Nova Premier 1.0 | - | 0,1502 TL | 0,7507 TL |
| Amazon: Nova Pro 1.0 | - | 0,048 TL | 0,1922 TL |
| Anthropic: Claude 3 Haiku | - | 0,015 TL | 0,0751 TL |
| Anthropic: Claude 3.5 Haiku | - | 0,048 TL | 0,2402 TL |
| Anthropic: Claude 3.5 Sonnet | - | 0,3604 TL | 1,8018 TL |
| Anthropic: Claude 3.7 Sonnet | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude 3.7 Sonnet (thinking) | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.1 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.5 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Sonnet 4 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Sonnet 4.5 | - | 0,1802 TL | 0,9009 TL |
| Arcee AI: Coder Large | - | 0,03 TL | 0,048 TL |
| Arcee AI: Maestro Reasoning | - | 0,0541 TL | 0,1982 TL |
| Arcee AI: Spotlight | - | 0,0108 TL | 0,0108 TL |
| Arcee AI: Trinity Mini | - | 0,0027 TL | 0,009 TL |
| Arcee AI: Virtuoso Large | - | 0,045 TL | 0,0721 TL |
| Auto Router | - | -60.060,00 TL | -60.060,00 TL |
| Baidu: ERNIE 4.5 21B A3B | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 21B A3B Thinking | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 300B A47B | - | 0,0168 TL | 0,0661 TL |
| Baidu: ERNIE 4.5 VL 28B A3B | - | 0,0084 TL | 0,0336 TL |
| Baidu: ERNIE 4.5 VL 424B A47B | - | 0,0252 TL | 0,0751 TL |
| Body Builder (beta) | - | -60.060,00 TL | -60.060,00 TL |
| ByteDance Seed: Seed 1.6 | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed 1.6 Flash | - | 0,0045 TL | 0,018 TL |
| ByteDance Seed: Seed-2.0-Lite | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed-2.0-Mini | - | 0,006 TL | 0,024 TL |
| ByteDance: UI-TARS 7B | - | 0,006 TL | 0,012 TL |
| Cohere: Command A | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R (08-2024) | - | 0,009 TL | 0,036 TL |
| Cohere: Command R+ (08-2024) | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R7B (12-2024) | - | 0,0023 TL | 0,009 TL |
| Deep Cogito: Cogito v2.1 671B | - | 0,0751 TL | 0,0751 TL |
| DeepSeek: DeepSeek V3 | - | 0,0192 TL | 0,0535 TL |
| DeepSeek: DeepSeek V3 0324 | - | 0,012 TL | 0,0462 TL |
| DeepSeek: DeepSeek V3.1 | - | 0,009 TL | 0,045 TL |
| DeepSeek: DeepSeek V3.1 Terminus | - | 0,0126 TL | 0,0474 TL |
| DeepSeek: DeepSeek V3.2 | - | 0,0156 TL | 0,0228 TL |
| DeepSeek: DeepSeek V3.2 Exp | - | 0,0162 TL | 0,0246 TL |
| DeepSeek: DeepSeek V3.2 Speciale | - | 0,024 TL | 0,0721 TL |
| DeepSeek: R1 | - | 0,042 TL | 0,1502 TL |
| DeepSeek: R1 0528 | - | 0,027 TL | 0,1291 TL |
| DeepSeek: R1 Distill Llama 70B | - | 0,042 TL | 0,048 TL |
| DeepSeek: R1 Distill Qwen 32B | - | 0,0174 TL | 0,0174 TL |
| EleutherAI: Llemma 7b | - | 0,048 TL | 0,0721 TL |
| EssentialAI: Rnj 1 Instruct | - | 0,009 TL | 0,009 TL |
| Goliath 120B | - | 0,2252 TL | 0,4505 TL |
| Google: Gemini 2.0 Flash | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.0 Flash Lite | - | 0,0045 TL | 0,018 TL |
| Google: Gemini 2.5 Flash | - | 0,018 TL | 0,1502 TL |
| Google: Gemini 2.5 Flash Lite | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Flash Lite Preview 09-2025 | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Pro | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 05-06 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 06-05 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 3 Flash Preview | - | 0,03 TL | 0,1802 TL |
| Google: Gemini 3.1 Flash Lite Preview | - | 0,015 TL | 0,0901 TL |
| Google: Gemini 3.1 Pro Preview | - | 0,1201 TL | 0,7207 TL |
| Google: Gemini 3.1 Pro Preview Custom Tools | - | 0,1201 TL | 0,7207 TL |
| Google: Gemma 2 27B | - | 0,039 TL | 0,039 TL |
| Google: Gemma 2 9B | - | 0,0018 TL | 0,0054 TL |
| Google: Gemma 3 12B | - | 0,0024 TL | 0,0078 TL |
| Google: Gemma 3 27B | - | 0,0048 TL | 0,0096 TL |
| Google: Gemma 3 4B | - | 0,0024 TL | 0,0048 TL |
| Google: Gemma 3n 4B | - | 0,0012 TL | 0,0024 TL |
| Google: Nano Banana (Gemini 2.5 Flash Image) | - | 0,018 TL | 0,1502 TL |
| Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) | - | 0,03 TL | 0,1802 TL |
| Google: Nano Banana Pro (Gemini 3 Pro Image Preview) | - | 0,1201 TL | 0,7207 TL |
| IBM: Granite 4.0 Micro | - | 0,001 TL | 0,0066 TL |
| Inception: Mercury | - | 0,015 TL | 0,045 TL |
| Inception: Mercury 2 | - | 0,015 TL | 0,045 TL |
| Inception: Mercury Coder | - | 0,015 TL | 0,045 TL |
| Inflection: Inflection 3 Pi | - | 0,1502 TL | 0,6006 TL |
| Inflection: Inflection 3 Productivity | - | 0,1502 TL | 0,6006 TL |
| Kwaipilot: KAT-Coder-Pro V1 | - | 0,0124 TL | 0,0497 TL |
| Kwaipilot: KAT-Coder-Pro V2 | - | 0,018 TL | 0,0721 TL |
| LiquidAI: LFM2-2.6B | - | 0,0006 TL | 0,0012 TL |
| LiquidAI: LFM2-24B-A2B | - | 0,0018 TL | 0,0072 TL |
| LiquidAI: LFM2-8B-A1B | - | 0,0006 TL | 0,0012 TL |
| Llama Guard 3 8B | - | 0,0012 TL | 0,0036 TL |
| Magnum v4 72B | - | 0,1802 TL | 0,3003 TL |
| Mancer: Weaver (alpha) | - | 0,045 TL | 0,0601 TL |
| Meituan: LongCat Flash Chat | - | 0,012 TL | 0,048 TL |
| Meta: Llama 3 70B Instruct | - | 0,0306 TL | 0,0444 TL |
| Meta: Llama 3 8B Instruct | - | 0,0018 TL | 0,0024 TL |
| Meta: Llama 3.1 70B Instruct | - | 0,024 TL | 0,024 TL |
| Meta: Llama 3.1 8B Instruct | - | 0,0012 TL | 0,003 TL |
| Meta: Llama 3.2 11B Vision Instruct | - | 0,0029 TL | 0,0029 TL |
| Meta: Llama 3.2 1B Instruct | - | 0,0016 TL | 0,012 TL |
| Meta: Llama 3.2 3B Instruct | - | 0,0031 TL | 0,0204 TL |
| Meta: Llama 3.3 70B Instruct | - | 0,006 TL | 0,0192 TL |
| Meta: Llama 4 Maverick | - | 0,009 TL | 0,036 TL |
| Meta: Llama 4 Scout | - | 0,0048 TL | 0,018 TL |
| Meta: Llama Guard 4 12B | - | 0,0108 TL | 0,0108 TL |
| Microsoft: Phi 4 | - | 0,0039 TL | 0,0084 TL |
| MiniMax: MiniMax M1 | - | 0,024 TL | 0,1321 TL |
| MiniMax: MiniMax M2 | - | 0,0153 TL | 0,0601 TL |
| MiniMax: MiniMax M2-her | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax M2.1 | - | 0,0162 TL | 0,0571 TL |
| MiniMax: MiniMax M2.5 | - | 0,0114 TL | 0,0691 TL |
| MiniMax: MiniMax M2.7 | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax-01 | - | 0,012 TL | 0,0661 TL |
| Mistral Large | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2407 | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Codestral 2508 | - | 0,018 TL | 0,0541 TL |
| Mistral: Devstral 2 2512 | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Medium | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Small 1.1 | - | 0,006 TL | 0,018 TL |
| Mistral: Ministral 3 14B 2512 | - | 0,012 TL | 0,012 TL |
| Mistral: Ministral 3 3B 2512 | - | 0,006 TL | 0,006 TL |
| Mistral: Ministral 3 8B 2512 | - | 0,009 TL | 0,009 TL |
| Mistral: Mistral 7B Instruct v0.1 | - | 0,0066 TL | 0,0114 TL |
| Mistral: Mistral Large 3 2512 | - | 0,03 TL | 0,0901 TL |
| Mistral: Mistral Medium 3 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Medium 3.1 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Nemo | - | 0,0012 TL | 0,0024 TL |
| Mistral: Mistral Small 3 | - | 0,003 TL | 0,0048 TL |
| Mistral: Mistral Small 3.1 24B | - | 0,0018 TL | 0,0066 TL |
| Mistral: Mistral Small 3.2 24B | - | 0,0045 TL | 0,012 TL |
| Mistral: Mistral Small 4 | - | 0,009 TL | 0,036 TL |
| Mistral: Mistral Small Creative | - | 0,006 TL | 0,018 TL |
| Mistral: Mixtral 8x22B Instruct | - | 0,1201 TL | 0,3604 TL |
| Mistral: Mixtral 8x7B Instruct | - | 0,0324 TL | 0,0324 TL |
| Mistral: Pixtral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Saba | - | 0,012 TL | 0,036 TL |
| Mistral: Voxtral Small 24B 2507 | - | 0,006 TL | 0,018 TL |
| MoonshotAI: Kimi K2 0711 | - | 0,0342 TL | 0,1381 TL |
| MoonshotAI: Kimi K2 0905 | - | 0,024 TL | 0,1201 TL |
| MoonshotAI: Kimi K2 Thinking | - | 0,0282 TL | 0,1201 TL |
| MoonshotAI: Kimi K2.5 | - | 0,0252 TL | 0,1321 TL |
| Morph: Morph V3 Fast | - | 0,048 TL | 0,0721 TL |
| Morph: Morph V3 Large | - | 0,0541 TL | 0,1141 TL |
| MythoMax 13B | - | 0,0036 TL | 0,0036 TL |
| Nex AGI: DeepSeek V3.1 Nex N1 | - | 0,0081 TL | 0,03 TL |
| Nous: Hermes 3 405B Instruct | - | 0,0601 TL | 0,0601 TL |
| Nous: Hermes 3 70B Instruct | - | 0,018 TL | 0,018 TL |
| Nous: Hermes 4 405B | - | 0,0601 TL | 0,1802 TL |
| Nous: Hermes 4 70B | - | 0,0078 TL | 0,024 TL |
| NousResearch: Hermes 2 Pro - Llama-3 8B | - | 0,0084 TL | 0,0084 TL |
| NVIDIA: Llama 3.1 Nemotron 70B Instruct | - | 0,0721 TL | 0,0721 TL |
| NVIDIA: Llama 3.1 Nemotron Ultra 253B v1 | - | 0,036 TL | 0,1081 TL |
| NVIDIA: Llama 3.3 Nemotron Super 49B V1.5 | - | 0,006 TL | 0,024 TL |
| NVIDIA: Nemotron 3 Nano 30B A3B | - | 0,003 TL | 0,012 TL |
| NVIDIA: Nemotron 3 Super | - | 0,006 TL | 0,03 TL |
| NVIDIA: Nemotron Nano 12B 2 VL | - | 0,012 TL | 0,036 TL |
| NVIDIA: Nemotron Nano 9B V2 | - | 0,0024 TL | 0,0096 TL |
| OpenAI: GPT Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT Audio Mini | - | 0,036 TL | 0,1441 TL |
| OpenAI: GPT-3.5 Turbo | - | 0,03 TL | 0,0901 TL |
| OpenAI: GPT-3.5 Turbo (older v0613) | - | 0,0601 TL | 0,1201 TL |
| OpenAI: GPT-3.5 Turbo 16k | - | 0,1802 TL | 0,2402 TL |
| OpenAI: GPT-3.5 Turbo Instruct | - | 0,0901 TL | 0,1201 TL |
| OpenAI: GPT-4 | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 (older v0314) | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 Turbo | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo (older v1106) | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo Preview | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4.1 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: GPT-4.1 Mini | - | 0,024 TL | 0,0961 TL |
| OpenAI: GPT-4.1 Nano | - | 0,006 TL | 0,024 TL |
| OpenAI: GPT-4o | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-05-13) | - | 0,3003 TL | 0,9009 TL |
| OpenAI: GPT-4o (2024-08-06) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-11-20) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (extended) | - | 0,3604 TL | 1,0811 TL |
| OpenAI: GPT-4o Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o Search Preview | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o-mini | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini (2024-07-18) | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini Search Preview | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-5 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Image | - | 0,6006 TL | 0,6006 TL |
| OpenAI: GPT-5 Image Mini | - | 0,1502 TL | 0,1201 TL |
| OpenAI: GPT-5 Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5 Nano | - | 0,003 TL | 0,024 TL |
| OpenAI: GPT-5 Pro | - | 0,9009 TL | 7,2072 TL |
| OpenAI: GPT-5.1 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Max | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5.2 | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Pro | - | 1,2613 TL | 10,0901 TL |
| OpenAI: GPT-5.2-Codex | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.4 | - | 0,1502 TL | 0,9009 TL |
| OpenAI: GPT-5.4 Mini | - | 0,045 TL | 0,2703 TL |
| OpenAI: GPT-5.4 Nano | - | 0,012 TL | 0,0751 TL |
| OpenAI: GPT-5.4 Pro | - | 1,8018 TL | 10,8108 TL |
| OpenAI: gpt-oss-120b | - | 0,0023 TL | 0,0114 TL |
| OpenAI: gpt-oss-20b | - | 0,0018 TL | 0,0066 TL |
| OpenAI: gpt-oss-safeguard-20b | - | 0,0045 TL | 0,018 TL |
| OpenAI: o1 | - | 0,9009 TL | 3,6036 TL |
| OpenAI: o1-pro | - | 9,009 TL | 36,036 TL |
| OpenAI: o3 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o3 Deep Research | - | 0,6006 TL | 2,4024 TL |
| OpenAI: o3 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Mini High | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Pro | - | 1,2012 TL | 4,8048 TL |
| OpenAI: o4 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o4 Mini Deep Research | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o4 Mini High | - | 0,0661 TL | 0,2643 TL |
| Perplexity: Sonar | - | 0,0601 TL | 0,0601 TL |
| Perplexity: Sonar Deep Research | - | 0,1201 TL | 0,4805 TL |
| Perplexity: Sonar Pro | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Pro Search | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Reasoning Pro | - | 0,1201 TL | 0,4805 TL |
| Prime Intellect: INTELLECT-3 | - | 0,012 TL | 0,0661 TL |
| Qwen: Qwen Plus 0728 | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen Plus 0728 (thinking) | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen VL Max | - | 0,0312 TL | 0,1249 TL |
| Qwen: Qwen VL Plus | - | 0,0082 TL | 0,0246 TL |
| Qwen: Qwen-Max | - | 0,0625 TL | 0,2498 TL |
| Qwen: Qwen-Plus | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen-Turbo | - | 0,002 TL | 0,0078 TL |
| Qwen: Qwen2.5 7B Instruct | - | 0,0024 TL | 0,006 TL |
| Qwen: Qwen2.5 Coder 7B Instruct | - | 0,0018 TL | 0,0054 TL |
| Qwen: Qwen2.5 VL 32B Instruct | - | 0,012 TL | 0,036 TL |
| Qwen: Qwen2.5 VL 72B Instruct | - | 0,048 TL | 0,048 TL |
| Qwen: Qwen3 14B | - | 0,0036 TL | 0,0144 TL |
| Qwen: Qwen3 235B A22B | - | 0,0273 TL | 0,1093 TL |
| Qwen: Qwen3 235B A22B Instruct 2507 | - | 0,0043 TL | 0,006 TL |
| Qwen: Qwen3 235B A22B Thinking 2507 | - | 0,009 TL | 0,0898 TL |
| Qwen: Qwen3 30B A3B | - | 0,0048 TL | 0,0168 TL |
| Qwen: Qwen3 30B A3B Instruct 2507 | - | 0,0054 TL | 0,018 TL |
| Qwen: Qwen3 30B A3B Thinking 2507 | - | 0,0048 TL | 0,024 TL |
| Qwen: Qwen3 32B | - | 0,0048 TL | 0,0144 TL |
| Qwen: Qwen3 8B | - | 0,003 TL | 0,024 TL |
| Qwen: Qwen3 Coder 30B A3B Instruct | - | 0,0042 TL | 0,0162 TL |
| Qwen: Qwen3 Coder 480B A35B | - | 0,0132 TL | 0,0601 TL |
| Qwen: Qwen3 Coder Flash | - | 0,0117 TL | 0,0586 TL |
| Qwen: Qwen3 Coder Next | - | 0,0072 TL | 0,045 TL |
| Qwen: Qwen3 Coder Plus | - | 0,039 TL | 0,1952 TL |
| Qwen: Qwen3 Max | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Max Thinking | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Next 80B A3B Instruct | - | 0,0054 TL | 0,0661 TL |
| Qwen: Qwen3 Next 80B A3B Thinking | - | 0,0059 TL | 0,0468 TL |
| Qwen: Qwen3 VL 235B A22B Instruct | - | 0,012 TL | 0,0529 TL |
| Qwen: Qwen3 VL 235B A22B Thinking | - | 0,0156 TL | 0,1562 TL |
| Qwen: Qwen3 VL 30B A3B Instruct | - | 0,0078 TL | 0,0312 TL |
| Qwen: Qwen3 VL 30B A3B Thinking | - | 0,0078 TL | 0,0937 TL |
| Qwen: Qwen3 VL 32B Instruct | - | 0,0062 TL | 0,025 TL |
| Qwen: Qwen3 VL 8B Instruct | - | 0,0048 TL | 0,03 TL |
| Qwen: Qwen3 VL 8B Thinking | - | 0,007 TL | 0,082 TL |
| Qwen: Qwen3.5 397B A17B | - | 0,0234 TL | 0,1405 TL |
| Qwen: Qwen3.5 Plus 2026-02-15 | - | 0,0156 TL | 0,0937 TL |
| Qwen: Qwen3.5-122B-A10B | - | 0,0156 TL | 0,1249 TL |
| Qwen: Qwen3.5-27B | - | 0,0117 TL | 0,0937 TL |
| Qwen: Qwen3.5-35B-A3B | - | 0,0098 TL | 0,0781 TL |
| Qwen: Qwen3.5-9B | - | 0,003 TL | 0,009 TL |
| Qwen: Qwen3.5-Flash | - | 0,0039 TL | 0,0156 TL |
| Qwen: QwQ 32B | - | 0,009 TL | 0,0348 TL |
| Qwen2.5 72B Instruct | - | 0,0072 TL | 0,0234 TL |
| Qwen2.5 Coder 32B Instruct | - | 0,0396 TL | 0,0601 TL |
| Reka Edge | - | 0,006 TL | 0,006 TL |
| Relace: Relace Apply 3 | - | 0,0511 TL | 0,0751 TL |
| Relace: Relace Search | - | 0,0601 TL | 0,1802 TL |
| ReMM SLERP 13B | - | 0,027 TL | 0,039 TL |
| Sao10K: Llama 3 8B Lunaris | - | 0,0024 TL | 0,003 TL |
| Sao10k: Llama 3 Euryale 70B v2.1 | - | 0,0889 TL | 0,0889 TL |
| Sao10K: Llama 3.1 70B Hanami x1 | - | 0,1802 TL | 0,1802 TL |
| Sao10K: Llama 3.1 Euryale 70B v2.2 | - | 0,0511 TL | 0,0511 TL |
| Sao10K: Llama 3.3 Euryale 70B | - | 0,039 TL | 0,045 TL |
| StepFun: Step 3.5 Flash | - | 0,006 TL | 0,018 TL |
| Switchpoint Router | - | 0,0511 TL | 0,2042 TL |
| Tencent: Hunyuan A13B Instruct | - | 0,0084 TL | 0,0342 TL |
| TheDrummer: Cydonia 24B V4.1 | - | 0,018 TL | 0,03 TL |
| TheDrummer: Rocinante 12B | - | 0,0102 TL | 0,0258 TL |
| TheDrummer: Skyfall 36B V2 | - | 0,033 TL | 0,048 TL |
| TheDrummer: UnslopNemo 12B | - | 0,024 TL | 0,024 TL |
| TNG: DeepSeek R1T2 Chimera | - | 0,018 TL | 0,0661 TL |
| Tongyi DeepResearch 30B A3B | - | 0,0054 TL | 0,027 TL |
| Upstage: Solar Pro 3 | - | 0,009 TL | 0,036 TL |
| WizardLM-2 8x22B | - | 0,0372 TL | 0,0372 TL |
| Writer: Palmyra X5 | - | 0,036 TL | 0,3604 TL |
| xAI: Grok 3 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Beta | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Mini | - | 0,018 TL | 0,03 TL |
| xAI: Grok 3 Mini Beta | - | 0,018 TL | 0,03 TL |
| xAI: Grok 4 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 4 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok 4.20 Beta | - | 0,1201 TL | 0,3604 TL |
| xAI: Grok 4.20 Multi-Agent Beta | - | 0,1201 TL | 0,3604 TL |
| Xiaomi: MiMo-V2-Flash | - | 0,0054 TL | 0,0174 TL |
| Xiaomi: MiMo-V2-Omni | - | 0,024 TL | 0,1201 TL |
| Xiaomi: MiMo-V2-Pro | - | 0,0601 TL | 0,1802 TL |
| Z.ai: GLM 4 32B | - | 0,006 TL | 0,006 TL |
| Z.ai: GLM 4.5 | - | 0,036 TL | 0,1321 TL |
| Z.ai: GLM 4.5 Air | - | 0,0078 TL | 0,0511 TL |
| Z.ai: GLM 4.5V | - | 0,036 TL | 0,1081 TL |
| Z.ai: GLM 4.6 | - | 0,0234 TL | 0,1141 TL |
| Z.ai: GLM 4.6V | - | 0,018 TL | 0,0541 TL |
| Z.ai: GLM 4.7 | - | 0,0234 TL | 0,1051 TL |
| Z.ai: GLM 4.7 Flash | - | 0,0036 TL | 0,024 TL |
| Z.ai: GLM 5 | - | 0,0432 TL | 0,1381 TL |
| Z.ai: GLM 5 Turbo | - | 0,0721 TL | 0,2402 TL |
Medya SEO Oluşturucu
| Model | Türkçe Kalitesi | Giriş (1K token) | Çıkış (1K token) |
|---|---|---|---|
| OpenAI: GPT-5.3 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.3-Codex | - | 0,1051 TL | 0,8408 TL |
| xAI: Grok 4.1 Fast | - | 0,012 TL | 0,03 TL |
| Anthropic: Claude Sonnet 4.6 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4.6 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Haiku 4.5 | - | 0,0601 TL | 0,3003 TL |
| Amazon: Nova 2 Lite | - | 0,018 TL | 0,1502 TL |
| Amazon: Nova Lite 1.0 | - | 0,0036 TL | 0,0144 TL |
| Amazon: Nova Premier 1.0 | - | 0,1502 TL | 0,7507 TL |
| Amazon: Nova Pro 1.0 | - | 0,048 TL | 0,1922 TL |
| Anthropic: Claude 3 Haiku | - | 0,015 TL | 0,0751 TL |
| Anthropic: Claude 3.5 Haiku | - | 0,048 TL | 0,2402 TL |
| Anthropic: Claude 3.5 Sonnet | - | 0,3604 TL | 1,8018 TL |
| Anthropic: Claude 3.7 Sonnet | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude 3.7 Sonnet (thinking) | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.1 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.5 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Sonnet 4 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Sonnet 4.5 | - | 0,1802 TL | 0,9009 TL |
| Arcee AI: Spotlight | - | 0,0108 TL | 0,0108 TL |
| Auto Router | - | -60.060,00 TL | -60.060,00 TL |
| Baidu: ERNIE 4.5 VL 28B A3B | - | 0,0084 TL | 0,0336 TL |
| Baidu: ERNIE 4.5 VL 424B A47B | - | 0,0252 TL | 0,0751 TL |
| ByteDance Seed: Seed 1.6 | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed 1.6 Flash | - | 0,0045 TL | 0,018 TL |
| ByteDance Seed: Seed-2.0-Lite | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed-2.0-Mini | - | 0,006 TL | 0,024 TL |
| ByteDance: UI-TARS 7B | - | 0,006 TL | 0,012 TL |
| Google: Gemini 2.0 Flash | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.0 Flash Lite | - | 0,0045 TL | 0,018 TL |
| Google: Gemini 2.5 Flash | - | 0,018 TL | 0,1502 TL |
| Google: Gemini 2.5 Flash Lite | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Flash Lite Preview 09-2025 | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Pro | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 05-06 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 06-05 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 3 Flash Preview | - | 0,03 TL | 0,1802 TL |
| Google: Gemini 3.1 Flash Lite Preview | - | 0,015 TL | 0,0901 TL |
| Google: Gemini 3.1 Pro Preview | - | 0,1201 TL | 0,7207 TL |
| Google: Gemini 3.1 Pro Preview Custom Tools | - | 0,1201 TL | 0,7207 TL |
| Google: Gemma 3 12B | - | 0,0024 TL | 0,0078 TL |
| Google: Gemma 3 27B | - | 0,0048 TL | 0,0096 TL |
| Google: Gemma 3 4B | - | 0,0024 TL | 0,0048 TL |
| Google: Nano Banana (Gemini 2.5 Flash Image) | - | 0,018 TL | 0,1502 TL |
| Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) | - | 0,03 TL | 0,1802 TL |
| Google: Nano Banana Pro (Gemini 3 Pro Image Preview) | - | 0,1201 TL | 0,7207 TL |
| Meta: Llama 3.2 11B Vision Instruct | - | 0,0029 TL | 0,0029 TL |
| Meta: Llama 4 Maverick | - | 0,009 TL | 0,036 TL |
| Meta: Llama 4 Scout | - | 0,0048 TL | 0,018 TL |
| Meta: Llama Guard 4 12B | - | 0,0108 TL | 0,0108 TL |
| MiniMax: MiniMax-01 | - | 0,012 TL | 0,0661 TL |
| Mistral: Ministral 3 14B 2512 | - | 0,012 TL | 0,012 TL |
| Mistral: Ministral 3 3B 2512 | - | 0,006 TL | 0,006 TL |
| Mistral: Ministral 3 8B 2512 | - | 0,009 TL | 0,009 TL |
| Mistral: Mistral Large 3 2512 | - | 0,03 TL | 0,0901 TL |
| Mistral: Mistral Medium 3 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Medium 3.1 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Small 3.1 24B | - | 0,0018 TL | 0,0066 TL |
| Mistral: Mistral Small 3.2 24B | - | 0,0045 TL | 0,012 TL |
| Mistral: Mistral Small 4 | - | 0,009 TL | 0,036 TL |
| Mistral: Pixtral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| MoonshotAI: Kimi K2.5 | - | 0,0252 TL | 0,1321 TL |
| NVIDIA: Nemotron Nano 12B 2 VL | - | 0,012 TL | 0,036 TL |
| OpenAI: GPT-4 Turbo | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4.1 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: GPT-4.1 Mini | - | 0,024 TL | 0,0961 TL |
| OpenAI: GPT-4.1 Nano | - | 0,006 TL | 0,024 TL |
| OpenAI: GPT-4o | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-05-13) | - | 0,3003 TL | 0,9009 TL |
| OpenAI: GPT-4o (2024-08-06) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-11-20) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (extended) | - | 0,3604 TL | 1,0811 TL |
| OpenAI: GPT-4o-mini | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini (2024-07-18) | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-5 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Image | - | 0,6006 TL | 0,6006 TL |
| OpenAI: GPT-5 Image Mini | - | 0,1502 TL | 0,1201 TL |
| OpenAI: GPT-5 Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5 Nano | - | 0,003 TL | 0,024 TL |
| OpenAI: GPT-5 Pro | - | 0,9009 TL | 7,2072 TL |
| OpenAI: GPT-5.1 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Max | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5.2 | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Pro | - | 1,2613 TL | 10,0901 TL |
| OpenAI: GPT-5.2-Codex | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.4 | - | 0,1502 TL | 0,9009 TL |
| OpenAI: GPT-5.4 Mini | - | 0,045 TL | 0,2703 TL |
| OpenAI: GPT-5.4 Nano | - | 0,012 TL | 0,0751 TL |
| OpenAI: GPT-5.4 Pro | - | 1,8018 TL | 10,8108 TL |
| OpenAI: o1 | - | 0,9009 TL | 3,6036 TL |
| OpenAI: o1-pro | - | 9,009 TL | 36,036 TL |
| OpenAI: o3 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o3 Deep Research | - | 0,6006 TL | 2,4024 TL |
| OpenAI: o3 Pro | - | 1,2012 TL | 4,8048 TL |
| OpenAI: o4 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o4 Mini Deep Research | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o4 Mini High | - | 0,0661 TL | 0,2643 TL |
| Perplexity: Sonar | - | 0,0601 TL | 0,0601 TL |
| Perplexity: Sonar Pro | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Pro Search | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Reasoning Pro | - | 0,1201 TL | 0,4805 TL |
| Qwen: Qwen VL Max | - | 0,0312 TL | 0,1249 TL |
| Qwen: Qwen VL Plus | - | 0,0082 TL | 0,0246 TL |
| Qwen: Qwen2.5 VL 32B Instruct | - | 0,012 TL | 0,036 TL |
| Qwen: Qwen2.5 VL 72B Instruct | - | 0,048 TL | 0,048 TL |
| Qwen: Qwen3 VL 235B A22B Instruct | - | 0,012 TL | 0,0529 TL |
| Qwen: Qwen3 VL 235B A22B Thinking | - | 0,0156 TL | 0,1562 TL |
| Qwen: Qwen3 VL 30B A3B Instruct | - | 0,0078 TL | 0,0312 TL |
| Qwen: Qwen3 VL 30B A3B Thinking | - | 0,0078 TL | 0,0937 TL |
| Qwen: Qwen3 VL 32B Instruct | - | 0,0062 TL | 0,025 TL |
| Qwen: Qwen3 VL 8B Instruct | - | 0,0048 TL | 0,03 TL |
| Qwen: Qwen3 VL 8B Thinking | - | 0,007 TL | 0,082 TL |
| Qwen: Qwen3.5 397B A17B | - | 0,0234 TL | 0,1405 TL |
| Qwen: Qwen3.5 Plus 2026-02-15 | - | 0,0156 TL | 0,0937 TL |
| Qwen: Qwen3.5-122B-A10B | - | 0,0156 TL | 0,1249 TL |
| Qwen: Qwen3.5-27B | - | 0,0117 TL | 0,0937 TL |
| Qwen: Qwen3.5-35B-A3B | - | 0,0098 TL | 0,0781 TL |
| Qwen: Qwen3.5-9B | - | 0,003 TL | 0,009 TL |
| Qwen: Qwen3.5-Flash | - | 0,0039 TL | 0,0156 TL |
| Reka Edge | - | 0,006 TL | 0,006 TL |
| xAI: Grok 4 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 4 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok 4.20 Beta | - | 0,1201 TL | 0,3604 TL |
| xAI: Grok 4.20 Multi-Agent Beta | - | 0,1201 TL | 0,3604 TL |
| Xiaomi: MiMo-V2-Omni | - | 0,024 TL | 0,1201 TL |
| Z.ai: GLM 4.5V | - | 0,036 TL | 0,1081 TL |
| Z.ai: GLM 4.6V | - | 0,018 TL | 0,0541 TL |
Özet Oluştur
| Model | Türkçe Kalitesi | Giriş (1K token) | Çıkış (1K token) |
|---|---|---|---|
| OpenAI: GPT-5.3 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.3-Codex | - | 0,1051 TL | 0,8408 TL |
| xAI: Grok 4.1 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok Code Fast 1 | - | 0,012 TL | 0,0901 TL |
| Anthropic: Claude Sonnet 4.6 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4.6 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Haiku 4.5 | - | 0,0601 TL | 0,3003 TL |
| AI21: Jamba Large 1.7 | - | 0,1201 TL | 0,4805 TL |
| AionLabs: Aion-1.0 | - | 0,2402 TL | 0,4805 TL |
| AionLabs: Aion-1.0-Mini | - | 0,042 TL | 0,0841 TL |
| AionLabs: Aion-2.0 | - | 0,048 TL | 0,0961 TL |
| AionLabs: Aion-RP 1.0 (8B) | - | 0,048 TL | 0,0961 TL |
| AlfredPros: CodeLLaMa 7B Instruct Solidity | - | 0,048 TL | 0,0721 TL |
| AllenAI: Olmo 2 32B Instruct | - | 0,003 TL | 0,012 TL |
| AllenAI: Olmo 3 32B Think | - | 0,009 TL | 0,03 TL |
| AllenAI: Olmo 3.1 32B Instruct | - | 0,012 TL | 0,036 TL |
| AllenAI: Olmo 3.1 32B Think | - | 0,009 TL | 0,03 TL |
| Amazon: Nova 2 Lite | - | 0,018 TL | 0,1502 TL |
| Amazon: Nova Lite 1.0 | - | 0,0036 TL | 0,0144 TL |
| Amazon: Nova Micro 1.0 | - | 0,0021 TL | 0,0084 TL |
| Amazon: Nova Premier 1.0 | - | 0,1502 TL | 0,7507 TL |
| Amazon: Nova Pro 1.0 | - | 0,048 TL | 0,1922 TL |
| Anthropic: Claude 3 Haiku | - | 0,015 TL | 0,0751 TL |
| Anthropic: Claude 3.5 Haiku | - | 0,048 TL | 0,2402 TL |
| Anthropic: Claude 3.5 Sonnet | - | 0,3604 TL | 1,8018 TL |
| Anthropic: Claude 3.7 Sonnet | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude 3.7 Sonnet (thinking) | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.1 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.5 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Sonnet 4 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Sonnet 4.5 | - | 0,1802 TL | 0,9009 TL |
| Arcee AI: Coder Large | - | 0,03 TL | 0,048 TL |
| Arcee AI: Maestro Reasoning | - | 0,0541 TL | 0,1982 TL |
| Arcee AI: Spotlight | - | 0,0108 TL | 0,0108 TL |
| Arcee AI: Trinity Mini | - | 0,0027 TL | 0,009 TL |
| Arcee AI: Virtuoso Large | - | 0,045 TL | 0,0721 TL |
| Auto Router | - | -60.060,00 TL | -60.060,00 TL |
| Baidu: ERNIE 4.5 21B A3B | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 21B A3B Thinking | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 300B A47B | - | 0,0168 TL | 0,0661 TL |
| Baidu: ERNIE 4.5 VL 28B A3B | - | 0,0084 TL | 0,0336 TL |
| Baidu: ERNIE 4.5 VL 424B A47B | - | 0,0252 TL | 0,0751 TL |
| Body Builder (beta) | - | -60.060,00 TL | -60.060,00 TL |
| ByteDance Seed: Seed 1.6 | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed 1.6 Flash | - | 0,0045 TL | 0,018 TL |
| ByteDance Seed: Seed-2.0-Lite | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed-2.0-Mini | - | 0,006 TL | 0,024 TL |
| ByteDance: UI-TARS 7B | - | 0,006 TL | 0,012 TL |
| Cohere: Command A | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R (08-2024) | - | 0,009 TL | 0,036 TL |
| Cohere: Command R+ (08-2024) | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R7B (12-2024) | - | 0,0023 TL | 0,009 TL |
| Deep Cogito: Cogito v2.1 671B | - | 0,0751 TL | 0,0751 TL |
| DeepSeek: DeepSeek V3 | - | 0,0192 TL | 0,0535 TL |
| DeepSeek: DeepSeek V3 0324 | - | 0,012 TL | 0,0462 TL |
| DeepSeek: DeepSeek V3.1 | - | 0,009 TL | 0,045 TL |
| DeepSeek: DeepSeek V3.1 Terminus | - | 0,0126 TL | 0,0474 TL |
| DeepSeek: DeepSeek V3.2 | - | 0,0156 TL | 0,0228 TL |
| DeepSeek: DeepSeek V3.2 Exp | - | 0,0162 TL | 0,0246 TL |
| DeepSeek: DeepSeek V3.2 Speciale | - | 0,024 TL | 0,0721 TL |
| DeepSeek: R1 | - | 0,042 TL | 0,1502 TL |
| DeepSeek: R1 0528 | - | 0,027 TL | 0,1291 TL |
| DeepSeek: R1 Distill Llama 70B | - | 0,042 TL | 0,048 TL |
| DeepSeek: R1 Distill Qwen 32B | - | 0,0174 TL | 0,0174 TL |
| EleutherAI: Llemma 7b | - | 0,048 TL | 0,0721 TL |
| EssentialAI: Rnj 1 Instruct | - | 0,009 TL | 0,009 TL |
| Goliath 120B | - | 0,2252 TL | 0,4505 TL |
| Google: Gemini 2.0 Flash | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.0 Flash Lite | - | 0,0045 TL | 0,018 TL |
| Google: Gemini 2.5 Flash | - | 0,018 TL | 0,1502 TL |
| Google: Gemini 2.5 Flash Lite | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Flash Lite Preview 09-2025 | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Pro | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 05-06 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 06-05 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 3 Flash Preview | - | 0,03 TL | 0,1802 TL |
| Google: Gemini 3.1 Flash Lite Preview | - | 0,015 TL | 0,0901 TL |
| Google: Gemini 3.1 Pro Preview | - | 0,1201 TL | 0,7207 TL |
| Google: Gemini 3.1 Pro Preview Custom Tools | - | 0,1201 TL | 0,7207 TL |
| Google: Gemma 2 27B | - | 0,039 TL | 0,039 TL |
| Google: Gemma 2 9B | - | 0,0018 TL | 0,0054 TL |
| Google: Gemma 3 12B | - | 0,0024 TL | 0,0078 TL |
| Google: Gemma 3 27B | - | 0,0048 TL | 0,0096 TL |
| Google: Gemma 3 4B | - | 0,0024 TL | 0,0048 TL |
| Google: Gemma 3n 4B | - | 0,0012 TL | 0,0024 TL |
| Google: Nano Banana (Gemini 2.5 Flash Image) | - | 0,018 TL | 0,1502 TL |
| Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) | - | 0,03 TL | 0,1802 TL |
| Google: Nano Banana Pro (Gemini 3 Pro Image Preview) | - | 0,1201 TL | 0,7207 TL |
| IBM: Granite 4.0 Micro | - | 0,001 TL | 0,0066 TL |
| Inception: Mercury | - | 0,015 TL | 0,045 TL |
| Inception: Mercury 2 | - | 0,015 TL | 0,045 TL |
| Inception: Mercury Coder | - | 0,015 TL | 0,045 TL |
| Inflection: Inflection 3 Pi | - | 0,1502 TL | 0,6006 TL |
| Inflection: Inflection 3 Productivity | - | 0,1502 TL | 0,6006 TL |
| Kwaipilot: KAT-Coder-Pro V1 | - | 0,0124 TL | 0,0497 TL |
| Kwaipilot: KAT-Coder-Pro V2 | - | 0,018 TL | 0,0721 TL |
| LiquidAI: LFM2-2.6B | - | 0,0006 TL | 0,0012 TL |
| LiquidAI: LFM2-24B-A2B | - | 0,0018 TL | 0,0072 TL |
| LiquidAI: LFM2-8B-A1B | - | 0,0006 TL | 0,0012 TL |
| Llama Guard 3 8B | - | 0,0012 TL | 0,0036 TL |
| Magnum v4 72B | - | 0,1802 TL | 0,3003 TL |
| Mancer: Weaver (alpha) | - | 0,045 TL | 0,0601 TL |
| Meituan: LongCat Flash Chat | - | 0,012 TL | 0,048 TL |
| Meta: Llama 3 70B Instruct | - | 0,0306 TL | 0,0444 TL |
| Meta: Llama 3 8B Instruct | - | 0,0018 TL | 0,0024 TL |
| Meta: Llama 3.1 70B Instruct | - | 0,024 TL | 0,024 TL |
| Meta: Llama 3.1 8B Instruct | - | 0,0012 TL | 0,003 TL |
| Meta: Llama 3.2 11B Vision Instruct | - | 0,0029 TL | 0,0029 TL |
| Meta: Llama 3.2 1B Instruct | - | 0,0016 TL | 0,012 TL |
| Meta: Llama 3.2 3B Instruct | - | 0,0031 TL | 0,0204 TL |
| Meta: Llama 3.3 70B Instruct | - | 0,006 TL | 0,0192 TL |
| Meta: Llama 4 Maverick | - | 0,009 TL | 0,036 TL |
| Meta: Llama 4 Scout | - | 0,0048 TL | 0,018 TL |
| Meta: Llama Guard 4 12B | - | 0,0108 TL | 0,0108 TL |
| Microsoft: Phi 4 | - | 0,0039 TL | 0,0084 TL |
| MiniMax: MiniMax M1 | - | 0,024 TL | 0,1321 TL |
| MiniMax: MiniMax M2 | - | 0,0153 TL | 0,0601 TL |
| MiniMax: MiniMax M2-her | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax M2.1 | - | 0,0162 TL | 0,0571 TL |
| MiniMax: MiniMax M2.5 | - | 0,0114 TL | 0,0691 TL |
| MiniMax: MiniMax M2.7 | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax-01 | - | 0,012 TL | 0,0661 TL |
| Mistral Large | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2407 | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Codestral 2508 | - | 0,018 TL | 0,0541 TL |
| Mistral: Devstral 2 2512 | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Medium | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Small 1.1 | - | 0,006 TL | 0,018 TL |
| Mistral: Ministral 3 14B 2512 | - | 0,012 TL | 0,012 TL |
| Mistral: Ministral 3 3B 2512 | - | 0,006 TL | 0,006 TL |
| Mistral: Ministral 3 8B 2512 | - | 0,009 TL | 0,009 TL |
| Mistral: Mistral 7B Instruct v0.1 | - | 0,0066 TL | 0,0114 TL |
| Mistral: Mistral Large 3 2512 | - | 0,03 TL | 0,0901 TL |
| Mistral: Mistral Medium 3 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Medium 3.1 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Nemo | - | 0,0012 TL | 0,0024 TL |
| Mistral: Mistral Small 3 | - | 0,003 TL | 0,0048 TL |
| Mistral: Mistral Small 3.1 24B | - | 0,0018 TL | 0,0066 TL |
| Mistral: Mistral Small 3.2 24B | - | 0,0045 TL | 0,012 TL |
| Mistral: Mistral Small 4 | - | 0,009 TL | 0,036 TL |
| Mistral: Mistral Small Creative | - | 0,006 TL | 0,018 TL |
| Mistral: Mixtral 8x22B Instruct | - | 0,1201 TL | 0,3604 TL |
| Mistral: Mixtral 8x7B Instruct | - | 0,0324 TL | 0,0324 TL |
| Mistral: Pixtral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Saba | - | 0,012 TL | 0,036 TL |
| Mistral: Voxtral Small 24B 2507 | - | 0,006 TL | 0,018 TL |
| MoonshotAI: Kimi K2 0711 | - | 0,0342 TL | 0,1381 TL |
| MoonshotAI: Kimi K2 0905 | - | 0,024 TL | 0,1201 TL |
| MoonshotAI: Kimi K2 Thinking | - | 0,0282 TL | 0,1201 TL |
| MoonshotAI: Kimi K2.5 | - | 0,0252 TL | 0,1321 TL |
| Morph: Morph V3 Fast | - | 0,048 TL | 0,0721 TL |
| Morph: Morph V3 Large | - | 0,0541 TL | 0,1141 TL |
| MythoMax 13B | - | 0,0036 TL | 0,0036 TL |
| Nex AGI: DeepSeek V3.1 Nex N1 | - | 0,0081 TL | 0,03 TL |
| Nous: Hermes 3 405B Instruct | - | 0,0601 TL | 0,0601 TL |
| Nous: Hermes 3 70B Instruct | - | 0,018 TL | 0,018 TL |
| Nous: Hermes 4 405B | - | 0,0601 TL | 0,1802 TL |
| Nous: Hermes 4 70B | - | 0,0078 TL | 0,024 TL |
| NousResearch: Hermes 2 Pro - Llama-3 8B | - | 0,0084 TL | 0,0084 TL |
| NVIDIA: Llama 3.1 Nemotron 70B Instruct | - | 0,0721 TL | 0,0721 TL |
| NVIDIA: Llama 3.1 Nemotron Ultra 253B v1 | - | 0,036 TL | 0,1081 TL |
| NVIDIA: Llama 3.3 Nemotron Super 49B V1.5 | - | 0,006 TL | 0,024 TL |
| NVIDIA: Nemotron 3 Nano 30B A3B | - | 0,003 TL | 0,012 TL |
| NVIDIA: Nemotron 3 Super | - | 0,006 TL | 0,03 TL |
| NVIDIA: Nemotron Nano 12B 2 VL | - | 0,012 TL | 0,036 TL |
| NVIDIA: Nemotron Nano 9B V2 | - | 0,0024 TL | 0,0096 TL |
| OpenAI: GPT Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT Audio Mini | - | 0,036 TL | 0,1441 TL |
| OpenAI: GPT-3.5 Turbo | - | 0,03 TL | 0,0901 TL |
| OpenAI: GPT-3.5 Turbo (older v0613) | - | 0,0601 TL | 0,1201 TL |
| OpenAI: GPT-3.5 Turbo 16k | - | 0,1802 TL | 0,2402 TL |
| OpenAI: GPT-3.5 Turbo Instruct | - | 0,0901 TL | 0,1201 TL |
| OpenAI: GPT-4 | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 (older v0314) | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 Turbo | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo (older v1106) | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo Preview | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4.1 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: GPT-4.1 Mini | - | 0,024 TL | 0,0961 TL |
| OpenAI: GPT-4.1 Nano | - | 0,006 TL | 0,024 TL |
| OpenAI: GPT-4o | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-05-13) | - | 0,3003 TL | 0,9009 TL |
| OpenAI: GPT-4o (2024-08-06) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-11-20) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (extended) | - | 0,3604 TL | 1,0811 TL |
| OpenAI: GPT-4o Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o Search Preview | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o-mini | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini (2024-07-18) | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini Search Preview | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-5 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Image | - | 0,6006 TL | 0,6006 TL |
| OpenAI: GPT-5 Image Mini | - | 0,1502 TL | 0,1201 TL |
| OpenAI: GPT-5 Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5 Nano | - | 0,003 TL | 0,024 TL |
| OpenAI: GPT-5 Pro | - | 0,9009 TL | 7,2072 TL |
| OpenAI: GPT-5.1 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Max | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5.2 | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Pro | - | 1,2613 TL | 10,0901 TL |
| OpenAI: GPT-5.2-Codex | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.4 | - | 0,1502 TL | 0,9009 TL |
| OpenAI: GPT-5.4 Mini | - | 0,045 TL | 0,2703 TL |
| OpenAI: GPT-5.4 Nano | - | 0,012 TL | 0,0751 TL |
| OpenAI: GPT-5.4 Pro | - | 1,8018 TL | 10,8108 TL |
| OpenAI: gpt-oss-120b | - | 0,0023 TL | 0,0114 TL |
| OpenAI: gpt-oss-20b | - | 0,0018 TL | 0,0066 TL |
| OpenAI: gpt-oss-safeguard-20b | - | 0,0045 TL | 0,018 TL |
| OpenAI: o1 | - | 0,9009 TL | 3,6036 TL |
| OpenAI: o1-pro | - | 9,009 TL | 36,036 TL |
| OpenAI: o3 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o3 Deep Research | - | 0,6006 TL | 2,4024 TL |
| OpenAI: o3 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Mini High | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Pro | - | 1,2012 TL | 4,8048 TL |
| OpenAI: o4 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o4 Mini Deep Research | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o4 Mini High | - | 0,0661 TL | 0,2643 TL |
| Perplexity: Sonar | - | 0,0601 TL | 0,0601 TL |
| Perplexity: Sonar Deep Research | - | 0,1201 TL | 0,4805 TL |
| Perplexity: Sonar Pro | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Pro Search | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Reasoning Pro | - | 0,1201 TL | 0,4805 TL |
| Prime Intellect: INTELLECT-3 | - | 0,012 TL | 0,0661 TL |
| Qwen: Qwen Plus 0728 | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen Plus 0728 (thinking) | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen VL Max | - | 0,0312 TL | 0,1249 TL |
| Qwen: Qwen VL Plus | - | 0,0082 TL | 0,0246 TL |
| Qwen: Qwen-Max | - | 0,0625 TL | 0,2498 TL |
| Qwen: Qwen-Plus | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen-Turbo | - | 0,002 TL | 0,0078 TL |
| Qwen: Qwen2.5 7B Instruct | - | 0,0024 TL | 0,006 TL |
| Qwen: Qwen2.5 Coder 7B Instruct | - | 0,0018 TL | 0,0054 TL |
| Qwen: Qwen2.5 VL 32B Instruct | - | 0,012 TL | 0,036 TL |
| Qwen: Qwen2.5 VL 72B Instruct | - | 0,048 TL | 0,048 TL |
| Qwen: Qwen3 14B | - | 0,0036 TL | 0,0144 TL |
| Qwen: Qwen3 235B A22B | - | 0,0273 TL | 0,1093 TL |
| Qwen: Qwen3 235B A22B Instruct 2507 | - | 0,0043 TL | 0,006 TL |
| Qwen: Qwen3 235B A22B Thinking 2507 | - | 0,009 TL | 0,0898 TL |
| Qwen: Qwen3 30B A3B | - | 0,0048 TL | 0,0168 TL |
| Qwen: Qwen3 30B A3B Instruct 2507 | - | 0,0054 TL | 0,018 TL |
| Qwen: Qwen3 30B A3B Thinking 2507 | - | 0,0048 TL | 0,024 TL |
| Qwen: Qwen3 32B | - | 0,0048 TL | 0,0144 TL |
| Qwen: Qwen3 8B | - | 0,003 TL | 0,024 TL |
| Qwen: Qwen3 Coder 30B A3B Instruct | - | 0,0042 TL | 0,0162 TL |
| Qwen: Qwen3 Coder 480B A35B | - | 0,0132 TL | 0,0601 TL |
| Qwen: Qwen3 Coder Flash | - | 0,0117 TL | 0,0586 TL |
| Qwen: Qwen3 Coder Next | - | 0,0072 TL | 0,045 TL |
| Qwen: Qwen3 Coder Plus | - | 0,039 TL | 0,1952 TL |
| Qwen: Qwen3 Max | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Max Thinking | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Next 80B A3B Instruct | - | 0,0054 TL | 0,0661 TL |
| Qwen: Qwen3 Next 80B A3B Thinking | - | 0,0059 TL | 0,0468 TL |
| Qwen: Qwen3 VL 235B A22B Instruct | - | 0,012 TL | 0,0529 TL |
| Qwen: Qwen3 VL 235B A22B Thinking | - | 0,0156 TL | 0,1562 TL |
| Qwen: Qwen3 VL 30B A3B Instruct | - | 0,0078 TL | 0,0312 TL |
| Qwen: Qwen3 VL 30B A3B Thinking | - | 0,0078 TL | 0,0937 TL |
| Qwen: Qwen3 VL 32B Instruct | - | 0,0062 TL | 0,025 TL |
| Qwen: Qwen3 VL 8B Instruct | - | 0,0048 TL | 0,03 TL |
| Qwen: Qwen3 VL 8B Thinking | - | 0,007 TL | 0,082 TL |
| Qwen: Qwen3.5 397B A17B | - | 0,0234 TL | 0,1405 TL |
| Qwen: Qwen3.5 Plus 2026-02-15 | - | 0,0156 TL | 0,0937 TL |
| Qwen: Qwen3.5-122B-A10B | - | 0,0156 TL | 0,1249 TL |
| Qwen: Qwen3.5-27B | - | 0,0117 TL | 0,0937 TL |
| Qwen: Qwen3.5-35B-A3B | - | 0,0098 TL | 0,0781 TL |
| Qwen: Qwen3.5-9B | - | 0,003 TL | 0,009 TL |
| Qwen: Qwen3.5-Flash | - | 0,0039 TL | 0,0156 TL |
| Qwen: QwQ 32B | - | 0,009 TL | 0,0348 TL |
| Qwen2.5 72B Instruct | - | 0,0072 TL | 0,0234 TL |
| Qwen2.5 Coder 32B Instruct | - | 0,0396 TL | 0,0601 TL |
| Reka Edge | - | 0,006 TL | 0,006 TL |
| Relace: Relace Apply 3 | - | 0,0511 TL | 0,0751 TL |
| Relace: Relace Search | - | 0,0601 TL | 0,1802 TL |
| ReMM SLERP 13B | - | 0,027 TL | 0,039 TL |
| Sao10K: Llama 3 8B Lunaris | - | 0,0024 TL | 0,003 TL |
| Sao10k: Llama 3 Euryale 70B v2.1 | - | 0,0889 TL | 0,0889 TL |
| Sao10K: Llama 3.1 70B Hanami x1 | - | 0,1802 TL | 0,1802 TL |
| Sao10K: Llama 3.1 Euryale 70B v2.2 | - | 0,0511 TL | 0,0511 TL |
| Sao10K: Llama 3.3 Euryale 70B | - | 0,039 TL | 0,045 TL |
| StepFun: Step 3.5 Flash | - | 0,006 TL | 0,018 TL |
| Switchpoint Router | - | 0,0511 TL | 0,2042 TL |
| Tencent: Hunyuan A13B Instruct | - | 0,0084 TL | 0,0342 TL |
| TheDrummer: Cydonia 24B V4.1 | - | 0,018 TL | 0,03 TL |
| TheDrummer: Rocinante 12B | - | 0,0102 TL | 0,0258 TL |
| TheDrummer: Skyfall 36B V2 | - | 0,033 TL | 0,048 TL |
| TheDrummer: UnslopNemo 12B | - | 0,024 TL | 0,024 TL |
| TNG: DeepSeek R1T2 Chimera | - | 0,018 TL | 0,0661 TL |
| Tongyi DeepResearch 30B A3B | - | 0,0054 TL | 0,027 TL |
| Upstage: Solar Pro 3 | - | 0,009 TL | 0,036 TL |
| WizardLM-2 8x22B | - | 0,0372 TL | 0,0372 TL |
| Writer: Palmyra X5 | - | 0,036 TL | 0,3604 TL |
| xAI: Grok 3 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Beta | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Mini | - | 0,018 TL | 0,03 TL |
| xAI: Grok 3 Mini Beta | - | 0,018 TL | 0,03 TL |
| xAI: Grok 4 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 4 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok 4.20 Beta | - | 0,1201 TL | 0,3604 TL |
| xAI: Grok 4.20 Multi-Agent Beta | - | 0,1201 TL | 0,3604 TL |
| Xiaomi: MiMo-V2-Flash | - | 0,0054 TL | 0,0174 TL |
| Xiaomi: MiMo-V2-Omni | - | 0,024 TL | 0,1201 TL |
| Xiaomi: MiMo-V2-Pro | - | 0,0601 TL | 0,1802 TL |
| Z.ai: GLM 4 32B | - | 0,006 TL | 0,006 TL |
| Z.ai: GLM 4.5 | - | 0,036 TL | 0,1321 TL |
| Z.ai: GLM 4.5 Air | - | 0,0078 TL | 0,0511 TL |
| Z.ai: GLM 4.5V | - | 0,036 TL | 0,1081 TL |
| Z.ai: GLM 4.6 | - | 0,0234 TL | 0,1141 TL |
| Z.ai: GLM 4.6V | - | 0,018 TL | 0,0541 TL |
| Z.ai: GLM 4.7 | - | 0,0234 TL | 0,1051 TL |
| Z.ai: GLM 4.7 Flash | - | 0,0036 TL | 0,024 TL |
| Z.ai: GLM 5 | - | 0,0432 TL | 0,1381 TL |
| Z.ai: GLM 5 Turbo | - | 0,0721 TL | 0,2402 TL |
Yazım Yardımcısı
| Model | Türkçe Kalitesi | Giriş (1K token) | Çıkış (1K token) |
|---|---|---|---|
| OpenAI: GPT-5.3 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.3-Codex | - | 0,1051 TL | 0,8408 TL |
| xAI: Grok 4.1 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok Code Fast 1 | - | 0,012 TL | 0,0901 TL |
| Anthropic: Claude Sonnet 4.6 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4.6 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Haiku 4.5 | - | 0,0601 TL | 0,3003 TL |
| AI21: Jamba Large 1.7 | - | 0,1201 TL | 0,4805 TL |
| AionLabs: Aion-1.0 | - | 0,2402 TL | 0,4805 TL |
| AionLabs: Aion-1.0-Mini | - | 0,042 TL | 0,0841 TL |
| AionLabs: Aion-2.0 | - | 0,048 TL | 0,0961 TL |
| AionLabs: Aion-RP 1.0 (8B) | - | 0,048 TL | 0,0961 TL |
| AlfredPros: CodeLLaMa 7B Instruct Solidity | - | 0,048 TL | 0,0721 TL |
| AllenAI: Olmo 2 32B Instruct | - | 0,003 TL | 0,012 TL |
| AllenAI: Olmo 3 32B Think | - | 0,009 TL | 0,03 TL |
| AllenAI: Olmo 3.1 32B Instruct | - | 0,012 TL | 0,036 TL |
| AllenAI: Olmo 3.1 32B Think | - | 0,009 TL | 0,03 TL |
| Amazon: Nova 2 Lite | - | 0,018 TL | 0,1502 TL |
| Amazon: Nova Lite 1.0 | - | 0,0036 TL | 0,0144 TL |
| Amazon: Nova Micro 1.0 | - | 0,0021 TL | 0,0084 TL |
| Amazon: Nova Premier 1.0 | - | 0,1502 TL | 0,7507 TL |
| Amazon: Nova Pro 1.0 | - | 0,048 TL | 0,1922 TL |
| Anthropic: Claude 3 Haiku | - | 0,015 TL | 0,0751 TL |
| Anthropic: Claude 3.5 Haiku | - | 0,048 TL | 0,2402 TL |
| Anthropic: Claude 3.5 Sonnet | - | 0,3604 TL | 1,8018 TL |
| Anthropic: Claude 3.7 Sonnet | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude 3.7 Sonnet (thinking) | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Opus 4 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.1 | - | 0,9009 TL | 4,5045 TL |
| Anthropic: Claude Opus 4.5 | - | 0,3003 TL | 1,5015 TL |
| Anthropic: Claude Sonnet 4 | - | 0,1802 TL | 0,9009 TL |
| Anthropic: Claude Sonnet 4.5 | - | 0,1802 TL | 0,9009 TL |
| Arcee AI: Coder Large | - | 0,03 TL | 0,048 TL |
| Arcee AI: Maestro Reasoning | - | 0,0541 TL | 0,1982 TL |
| Arcee AI: Spotlight | - | 0,0108 TL | 0,0108 TL |
| Arcee AI: Trinity Mini | - | 0,0027 TL | 0,009 TL |
| Arcee AI: Virtuoso Large | - | 0,045 TL | 0,0721 TL |
| Auto Router | - | -60.060,00 TL | -60.060,00 TL |
| Baidu: ERNIE 4.5 21B A3B | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 21B A3B Thinking | - | 0,0042 TL | 0,0168 TL |
| Baidu: ERNIE 4.5 300B A47B | - | 0,0168 TL | 0,0661 TL |
| Baidu: ERNIE 4.5 VL 28B A3B | - | 0,0084 TL | 0,0336 TL |
| Baidu: ERNIE 4.5 VL 424B A47B | - | 0,0252 TL | 0,0751 TL |
| Body Builder (beta) | - | -60.060,00 TL | -60.060,00 TL |
| ByteDance Seed: Seed 1.6 | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed 1.6 Flash | - | 0,0045 TL | 0,018 TL |
| ByteDance Seed: Seed-2.0-Lite | - | 0,015 TL | 0,1201 TL |
| ByteDance Seed: Seed-2.0-Mini | - | 0,006 TL | 0,024 TL |
| ByteDance: UI-TARS 7B | - | 0,006 TL | 0,012 TL |
| Cohere: Command A | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R (08-2024) | - | 0,009 TL | 0,036 TL |
| Cohere: Command R+ (08-2024) | - | 0,1502 TL | 0,6006 TL |
| Cohere: Command R7B (12-2024) | - | 0,0023 TL | 0,009 TL |
| Deep Cogito: Cogito v2.1 671B | - | 0,0751 TL | 0,0751 TL |
| DeepSeek: DeepSeek V3 | - | 0,0192 TL | 0,0535 TL |
| DeepSeek: DeepSeek V3 0324 | - | 0,012 TL | 0,0462 TL |
| DeepSeek: DeepSeek V3.1 | - | 0,009 TL | 0,045 TL |
| DeepSeek: DeepSeek V3.1 Terminus | - | 0,0126 TL | 0,0474 TL |
| DeepSeek: DeepSeek V3.2 | - | 0,0156 TL | 0,0228 TL |
| DeepSeek: DeepSeek V3.2 Exp | - | 0,0162 TL | 0,0246 TL |
| DeepSeek: DeepSeek V3.2 Speciale | - | 0,024 TL | 0,0721 TL |
| DeepSeek: R1 | - | 0,042 TL | 0,1502 TL |
| DeepSeek: R1 0528 | - | 0,027 TL | 0,1291 TL |
| DeepSeek: R1 Distill Llama 70B | - | 0,042 TL | 0,048 TL |
| DeepSeek: R1 Distill Qwen 32B | - | 0,0174 TL | 0,0174 TL |
| EleutherAI: Llemma 7b | - | 0,048 TL | 0,0721 TL |
| EssentialAI: Rnj 1 Instruct | - | 0,009 TL | 0,009 TL |
| Goliath 120B | - | 0,2252 TL | 0,4505 TL |
| Google: Gemini 2.0 Flash | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.0 Flash Lite | - | 0,0045 TL | 0,018 TL |
| Google: Gemini 2.5 Flash | - | 0,018 TL | 0,1502 TL |
| Google: Gemini 2.5 Flash Lite | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Flash Lite Preview 09-2025 | - | 0,006 TL | 0,024 TL |
| Google: Gemini 2.5 Pro | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 05-06 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 2.5 Pro Preview 06-05 | - | 0,0751 TL | 0,6006 TL |
| Google: Gemini 3 Flash Preview | - | 0,03 TL | 0,1802 TL |
| Google: Gemini 3.1 Flash Lite Preview | - | 0,015 TL | 0,0901 TL |
| Google: Gemini 3.1 Pro Preview | - | 0,1201 TL | 0,7207 TL |
| Google: Gemini 3.1 Pro Preview Custom Tools | - | 0,1201 TL | 0,7207 TL |
| Google: Gemma 2 27B | - | 0,039 TL | 0,039 TL |
| Google: Gemma 2 9B | - | 0,0018 TL | 0,0054 TL |
| Google: Gemma 3 12B | - | 0,0024 TL | 0,0078 TL |
| Google: Gemma 3 27B | - | 0,0048 TL | 0,0096 TL |
| Google: Gemma 3 4B | - | 0,0024 TL | 0,0048 TL |
| Google: Gemma 3n 4B | - | 0,0012 TL | 0,0024 TL |
| Google: Nano Banana (Gemini 2.5 Flash Image) | - | 0,018 TL | 0,1502 TL |
| Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) | - | 0,03 TL | 0,1802 TL |
| Google: Nano Banana Pro (Gemini 3 Pro Image Preview) | - | 0,1201 TL | 0,7207 TL |
| IBM: Granite 4.0 Micro | - | 0,001 TL | 0,0066 TL |
| Inception: Mercury | - | 0,015 TL | 0,045 TL |
| Inception: Mercury 2 | - | 0,015 TL | 0,045 TL |
| Inception: Mercury Coder | - | 0,015 TL | 0,045 TL |
| Inflection: Inflection 3 Pi | - | 0,1502 TL | 0,6006 TL |
| Inflection: Inflection 3 Productivity | - | 0,1502 TL | 0,6006 TL |
| Kwaipilot: KAT-Coder-Pro V1 | - | 0,0124 TL | 0,0497 TL |
| Kwaipilot: KAT-Coder-Pro V2 | - | 0,018 TL | 0,0721 TL |
| LiquidAI: LFM2-2.6B | - | 0,0006 TL | 0,0012 TL |
| LiquidAI: LFM2-24B-A2B | - | 0,0018 TL | 0,0072 TL |
| LiquidAI: LFM2-8B-A1B | - | 0,0006 TL | 0,0012 TL |
| Llama Guard 3 8B | - | 0,0012 TL | 0,0036 TL |
| Magnum v4 72B | - | 0,1802 TL | 0,3003 TL |
| Mancer: Weaver (alpha) | - | 0,045 TL | 0,0601 TL |
| Meituan: LongCat Flash Chat | - | 0,012 TL | 0,048 TL |
| Meta: Llama 3 70B Instruct | - | 0,0306 TL | 0,0444 TL |
| Meta: Llama 3 8B Instruct | - | 0,0018 TL | 0,0024 TL |
| Meta: Llama 3.1 70B Instruct | - | 0,024 TL | 0,024 TL |
| Meta: Llama 3.1 8B Instruct | - | 0,0012 TL | 0,003 TL |
| Meta: Llama 3.2 11B Vision Instruct | - | 0,0029 TL | 0,0029 TL |
| Meta: Llama 3.2 1B Instruct | - | 0,0016 TL | 0,012 TL |
| Meta: Llama 3.2 3B Instruct | - | 0,0031 TL | 0,0204 TL |
| Meta: Llama 3.3 70B Instruct | - | 0,006 TL | 0,0192 TL |
| Meta: Llama 4 Maverick | - | 0,009 TL | 0,036 TL |
| Meta: Llama 4 Scout | - | 0,0048 TL | 0,018 TL |
| Meta: Llama Guard 4 12B | - | 0,0108 TL | 0,0108 TL |
| Microsoft: Phi 4 | - | 0,0039 TL | 0,0084 TL |
| MiniMax: MiniMax M1 | - | 0,024 TL | 0,1321 TL |
| MiniMax: MiniMax M2 | - | 0,0153 TL | 0,0601 TL |
| MiniMax: MiniMax M2-her | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax M2.1 | - | 0,0162 TL | 0,0571 TL |
| MiniMax: MiniMax M2.5 | - | 0,0114 TL | 0,0691 TL |
| MiniMax: MiniMax M2.7 | - | 0,018 TL | 0,0721 TL |
| MiniMax: MiniMax-01 | - | 0,012 TL | 0,0661 TL |
| Mistral Large | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2407 | - | 0,1201 TL | 0,3604 TL |
| Mistral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Codestral 2508 | - | 0,018 TL | 0,0541 TL |
| Mistral: Devstral 2 2512 | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Medium | - | 0,024 TL | 0,1201 TL |
| Mistral: Devstral Small 1.1 | - | 0,006 TL | 0,018 TL |
| Mistral: Ministral 3 14B 2512 | - | 0,012 TL | 0,012 TL |
| Mistral: Ministral 3 3B 2512 | - | 0,006 TL | 0,006 TL |
| Mistral: Ministral 3 8B 2512 | - | 0,009 TL | 0,009 TL |
| Mistral: Mistral 7B Instruct v0.1 | - | 0,0066 TL | 0,0114 TL |
| Mistral: Mistral Large 3 2512 | - | 0,03 TL | 0,0901 TL |
| Mistral: Mistral Medium 3 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Medium 3.1 | - | 0,024 TL | 0,1201 TL |
| Mistral: Mistral Nemo | - | 0,0012 TL | 0,0024 TL |
| Mistral: Mistral Small 3 | - | 0,003 TL | 0,0048 TL |
| Mistral: Mistral Small 3.1 24B | - | 0,0018 TL | 0,0066 TL |
| Mistral: Mistral Small 3.2 24B | - | 0,0045 TL | 0,012 TL |
| Mistral: Mistral Small 4 | - | 0,009 TL | 0,036 TL |
| Mistral: Mistral Small Creative | - | 0,006 TL | 0,018 TL |
| Mistral: Mixtral 8x22B Instruct | - | 0,1201 TL | 0,3604 TL |
| Mistral: Mixtral 8x7B Instruct | - | 0,0324 TL | 0,0324 TL |
| Mistral: Pixtral Large 2411 | - | 0,1201 TL | 0,3604 TL |
| Mistral: Saba | - | 0,012 TL | 0,036 TL |
| Mistral: Voxtral Small 24B 2507 | - | 0,006 TL | 0,018 TL |
| MoonshotAI: Kimi K2 0711 | - | 0,0342 TL | 0,1381 TL |
| MoonshotAI: Kimi K2 0905 | - | 0,024 TL | 0,1201 TL |
| MoonshotAI: Kimi K2 Thinking | - | 0,0282 TL | 0,1201 TL |
| MoonshotAI: Kimi K2.5 | - | 0,0252 TL | 0,1321 TL |
| Morph: Morph V3 Fast | - | 0,048 TL | 0,0721 TL |
| Morph: Morph V3 Large | - | 0,0541 TL | 0,1141 TL |
| MythoMax 13B | - | 0,0036 TL | 0,0036 TL |
| Nex AGI: DeepSeek V3.1 Nex N1 | - | 0,0081 TL | 0,03 TL |
| Nous: Hermes 3 405B Instruct | - | 0,0601 TL | 0,0601 TL |
| Nous: Hermes 3 70B Instruct | - | 0,018 TL | 0,018 TL |
| Nous: Hermes 4 405B | - | 0,0601 TL | 0,1802 TL |
| Nous: Hermes 4 70B | - | 0,0078 TL | 0,024 TL |
| NousResearch: Hermes 2 Pro - Llama-3 8B | - | 0,0084 TL | 0,0084 TL |
| NVIDIA: Llama 3.1 Nemotron 70B Instruct | - | 0,0721 TL | 0,0721 TL |
| NVIDIA: Llama 3.1 Nemotron Ultra 253B v1 | - | 0,036 TL | 0,1081 TL |
| NVIDIA: Llama 3.3 Nemotron Super 49B V1.5 | - | 0,006 TL | 0,024 TL |
| NVIDIA: Nemotron 3 Nano 30B A3B | - | 0,003 TL | 0,012 TL |
| NVIDIA: Nemotron 3 Super | - | 0,006 TL | 0,03 TL |
| NVIDIA: Nemotron Nano 12B 2 VL | - | 0,012 TL | 0,036 TL |
| NVIDIA: Nemotron Nano 9B V2 | - | 0,0024 TL | 0,0096 TL |
| OpenAI: GPT Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT Audio Mini | - | 0,036 TL | 0,1441 TL |
| OpenAI: GPT-3.5 Turbo | - | 0,03 TL | 0,0901 TL |
| OpenAI: GPT-3.5 Turbo (older v0613) | - | 0,0601 TL | 0,1201 TL |
| OpenAI: GPT-3.5 Turbo 16k | - | 0,1802 TL | 0,2402 TL |
| OpenAI: GPT-3.5 Turbo Instruct | - | 0,0901 TL | 0,1201 TL |
| OpenAI: GPT-4 | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 (older v0314) | - | 1,8018 TL | 3,6036 TL |
| OpenAI: GPT-4 Turbo | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo (older v1106) | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4 Turbo Preview | - | 0,6006 TL | 1,8018 TL |
| OpenAI: GPT-4.1 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: GPT-4.1 Mini | - | 0,024 TL | 0,0961 TL |
| OpenAI: GPT-4.1 Nano | - | 0,006 TL | 0,024 TL |
| OpenAI: GPT-4o | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-05-13) | - | 0,3003 TL | 0,9009 TL |
| OpenAI: GPT-4o (2024-08-06) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (2024-11-20) | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o (extended) | - | 0,3604 TL | 1,0811 TL |
| OpenAI: GPT-4o Audio | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o Search Preview | - | 0,1502 TL | 0,6006 TL |
| OpenAI: GPT-4o-mini | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini (2024-07-18) | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-4o-mini Search Preview | - | 0,009 TL | 0,036 TL |
| OpenAI: GPT-5 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5 Image | - | 0,6006 TL | 0,6006 TL |
| OpenAI: GPT-5 Image Mini | - | 0,1502 TL | 0,1201 TL |
| OpenAI: GPT-5 Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5 Nano | - | 0,003 TL | 0,024 TL |
| OpenAI: GPT-5 Pro | - | 0,9009 TL | 7,2072 TL |
| OpenAI: GPT-5.1 | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1 Chat | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Max | - | 0,0751 TL | 0,6006 TL |
| OpenAI: GPT-5.1-Codex-Mini | - | 0,015 TL | 0,1201 TL |
| OpenAI: GPT-5.2 | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Chat | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.2 Pro | - | 1,2613 TL | 10,0901 TL |
| OpenAI: GPT-5.2-Codex | - | 0,1051 TL | 0,8408 TL |
| OpenAI: GPT-5.4 | - | 0,1502 TL | 0,9009 TL |
| OpenAI: GPT-5.4 Mini | - | 0,045 TL | 0,2703 TL |
| OpenAI: GPT-5.4 Nano | - | 0,012 TL | 0,0751 TL |
| OpenAI: GPT-5.4 Pro | - | 1,8018 TL | 10,8108 TL |
| OpenAI: gpt-oss-120b | - | 0,0023 TL | 0,0114 TL |
| OpenAI: gpt-oss-20b | - | 0,0018 TL | 0,0066 TL |
| OpenAI: gpt-oss-safeguard-20b | - | 0,0045 TL | 0,018 TL |
| OpenAI: o1 | - | 0,9009 TL | 3,6036 TL |
| OpenAI: o1-pro | - | 9,009 TL | 36,036 TL |
| OpenAI: o3 | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o3 Deep Research | - | 0,6006 TL | 2,4024 TL |
| OpenAI: o3 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Mini High | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o3 Pro | - | 1,2012 TL | 4,8048 TL |
| OpenAI: o4 Mini | - | 0,0661 TL | 0,2643 TL |
| OpenAI: o4 Mini Deep Research | - | 0,1201 TL | 0,4805 TL |
| OpenAI: o4 Mini High | - | 0,0661 TL | 0,2643 TL |
| Perplexity: Sonar | - | 0,0601 TL | 0,0601 TL |
| Perplexity: Sonar Deep Research | - | 0,1201 TL | 0,4805 TL |
| Perplexity: Sonar Pro | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Pro Search | - | 0,1802 TL | 0,9009 TL |
| Perplexity: Sonar Reasoning Pro | - | 0,1201 TL | 0,4805 TL |
| Prime Intellect: INTELLECT-3 | - | 0,012 TL | 0,0661 TL |
| Qwen: Qwen Plus 0728 | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen Plus 0728 (thinking) | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen VL Max | - | 0,0312 TL | 0,1249 TL |
| Qwen: Qwen VL Plus | - | 0,0082 TL | 0,0246 TL |
| Qwen: Qwen-Max | - | 0,0625 TL | 0,2498 TL |
| Qwen: Qwen-Plus | - | 0,0156 TL | 0,0468 TL |
| Qwen: Qwen-Turbo | - | 0,002 TL | 0,0078 TL |
| Qwen: Qwen2.5 7B Instruct | - | 0,0024 TL | 0,006 TL |
| Qwen: Qwen2.5 Coder 7B Instruct | - | 0,0018 TL | 0,0054 TL |
| Qwen: Qwen2.5 VL 32B Instruct | - | 0,012 TL | 0,036 TL |
| Qwen: Qwen2.5 VL 72B Instruct | - | 0,048 TL | 0,048 TL |
| Qwen: Qwen3 14B | - | 0,0036 TL | 0,0144 TL |
| Qwen: Qwen3 235B A22B | - | 0,0273 TL | 0,1093 TL |
| Qwen: Qwen3 235B A22B Instruct 2507 | - | 0,0043 TL | 0,006 TL |
| Qwen: Qwen3 235B A22B Thinking 2507 | - | 0,009 TL | 0,0898 TL |
| Qwen: Qwen3 30B A3B | - | 0,0048 TL | 0,0168 TL |
| Qwen: Qwen3 30B A3B Instruct 2507 | - | 0,0054 TL | 0,018 TL |
| Qwen: Qwen3 30B A3B Thinking 2507 | - | 0,0048 TL | 0,024 TL |
| Qwen: Qwen3 32B | - | 0,0048 TL | 0,0144 TL |
| Qwen: Qwen3 8B | - | 0,003 TL | 0,024 TL |
| Qwen: Qwen3 Coder 30B A3B Instruct | - | 0,0042 TL | 0,0162 TL |
| Qwen: Qwen3 Coder 480B A35B | - | 0,0132 TL | 0,0601 TL |
| Qwen: Qwen3 Coder Flash | - | 0,0117 TL | 0,0586 TL |
| Qwen: Qwen3 Coder Next | - | 0,0072 TL | 0,045 TL |
| Qwen: Qwen3 Coder Plus | - | 0,039 TL | 0,1952 TL |
| Qwen: Qwen3 Max | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Max Thinking | - | 0,0468 TL | 0,2342 TL |
| Qwen: Qwen3 Next 80B A3B Instruct | - | 0,0054 TL | 0,0661 TL |
| Qwen: Qwen3 Next 80B A3B Thinking | - | 0,0059 TL | 0,0468 TL |
| Qwen: Qwen3 VL 235B A22B Instruct | - | 0,012 TL | 0,0529 TL |
| Qwen: Qwen3 VL 235B A22B Thinking | - | 0,0156 TL | 0,1562 TL |
| Qwen: Qwen3 VL 30B A3B Instruct | - | 0,0078 TL | 0,0312 TL |
| Qwen: Qwen3 VL 30B A3B Thinking | - | 0,0078 TL | 0,0937 TL |
| Qwen: Qwen3 VL 32B Instruct | - | 0,0062 TL | 0,025 TL |
| Qwen: Qwen3 VL 8B Instruct | - | 0,0048 TL | 0,03 TL |
| Qwen: Qwen3 VL 8B Thinking | - | 0,007 TL | 0,082 TL |
| Qwen: Qwen3.5 397B A17B | - | 0,0234 TL | 0,1405 TL |
| Qwen: Qwen3.5 Plus 2026-02-15 | - | 0,0156 TL | 0,0937 TL |
| Qwen: Qwen3.5-122B-A10B | - | 0,0156 TL | 0,1249 TL |
| Qwen: Qwen3.5-27B | - | 0,0117 TL | 0,0937 TL |
| Qwen: Qwen3.5-35B-A3B | - | 0,0098 TL | 0,0781 TL |
| Qwen: Qwen3.5-9B | - | 0,003 TL | 0,009 TL |
| Qwen: Qwen3.5-Flash | - | 0,0039 TL | 0,0156 TL |
| Qwen: QwQ 32B | - | 0,009 TL | 0,0348 TL |
| Qwen2.5 72B Instruct | - | 0,0072 TL | 0,0234 TL |
| Qwen2.5 Coder 32B Instruct | - | 0,0396 TL | 0,0601 TL |
| Reka Edge | - | 0,006 TL | 0,006 TL |
| Relace: Relace Apply 3 | - | 0,0511 TL | 0,0751 TL |
| Relace: Relace Search | - | 0,0601 TL | 0,1802 TL |
| ReMM SLERP 13B | - | 0,027 TL | 0,039 TL |
| Sao10K: Llama 3 8B Lunaris | - | 0,0024 TL | 0,003 TL |
| Sao10k: Llama 3 Euryale 70B v2.1 | - | 0,0889 TL | 0,0889 TL |
| Sao10K: Llama 3.1 70B Hanami x1 | - | 0,1802 TL | 0,1802 TL |
| Sao10K: Llama 3.1 Euryale 70B v2.2 | - | 0,0511 TL | 0,0511 TL |
| Sao10K: Llama 3.3 Euryale 70B | - | 0,039 TL | 0,045 TL |
| StepFun: Step 3.5 Flash | - | 0,006 TL | 0,018 TL |
| Switchpoint Router | - | 0,0511 TL | 0,2042 TL |
| Tencent: Hunyuan A13B Instruct | - | 0,0084 TL | 0,0342 TL |
| TheDrummer: Cydonia 24B V4.1 | - | 0,018 TL | 0,03 TL |
| TheDrummer: Rocinante 12B | - | 0,0102 TL | 0,0258 TL |
| TheDrummer: Skyfall 36B V2 | - | 0,033 TL | 0,048 TL |
| TheDrummer: UnslopNemo 12B | - | 0,024 TL | 0,024 TL |
| TNG: DeepSeek R1T2 Chimera | - | 0,018 TL | 0,0661 TL |
| Tongyi DeepResearch 30B A3B | - | 0,0054 TL | 0,027 TL |
| Upstage: Solar Pro 3 | - | 0,009 TL | 0,036 TL |
| WizardLM-2 8x22B | - | 0,0372 TL | 0,0372 TL |
| Writer: Palmyra X5 | - | 0,036 TL | 0,3604 TL |
| xAI: Grok 3 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Beta | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 3 Mini | - | 0,018 TL | 0,03 TL |
| xAI: Grok 3 Mini Beta | - | 0,018 TL | 0,03 TL |
| xAI: Grok 4 | - | 0,1802 TL | 0,9009 TL |
| xAI: Grok 4 Fast | - | 0,012 TL | 0,03 TL |
| xAI: Grok 4.20 Beta | - | 0,1201 TL | 0,3604 TL |
| xAI: Grok 4.20 Multi-Agent Beta | - | 0,1201 TL | 0,3604 TL |
| Xiaomi: MiMo-V2-Flash | - | 0,0054 TL | 0,0174 TL |
| Xiaomi: MiMo-V2-Omni | - | 0,024 TL | 0,1201 TL |
| Xiaomi: MiMo-V2-Pro | - | 0,0601 TL | 0,1802 TL |
| Z.ai: GLM 4 32B | - | 0,006 TL | 0,006 TL |
| Z.ai: GLM 4.5 | - | 0,036 TL | 0,1321 TL |
| Z.ai: GLM 4.5 Air | - | 0,0078 TL | 0,0511 TL |
| Z.ai: GLM 4.5V | - | 0,036 TL | 0,1081 TL |
| Z.ai: GLM 4.6 | - | 0,0234 TL | 0,1141 TL |
| Z.ai: GLM 4.6V | - | 0,018 TL | 0,0541 TL |
| Z.ai: GLM 4.7 | - | 0,0234 TL | 0,1051 TL |
| Z.ai: GLM 4.7 Flash | - | 0,0036 TL | 0,024 TL |
| Z.ai: GLM 5 | - | 0,0432 TL | 0,1381 TL |
| Z.ai: GLM 5 Turbo | - | 0,0721 TL | 0,2402 TL |
Fiyatlar günlük TCMB döviz kuruna göre güncellenir. Güncel fiyatlar için Fiyatlandırma sayfasını ziyaret edin.
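Fiyatlar günlük değiştiği için birim fiyatları koda sabit yazmak yerine servis detay endpoint'inden çekmek daha güvenlidir. Aşağıdaki Python taslağı (requests kütüphanesi kullanıldığı varsayılmıştır) güncel 1K token fiyatlarını GET /api/v1/services/:slug yanıtından okuyup yaklaşık istek maliyetini hesaplar; alan adları bu dokümandaki yanıt örneklerine dayanır, kesin bir uygulama değildir.

# Varsayımsal taslak: güncel birim fiyatlarla yaklaşık maliyet hesabı.
import requests

API_KEY = "tkai_YOUR_API_KEY"
BASE_URL = "https://tektik.ai/api/v1"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def tahmini_maliyet_tl(slug: str, model_id: str, input_tokens: int, output_tokens: int) -> float:
    """Servis detayındaki 1K token fiyatlarıyla yaklaşık maliyeti (TL) döner."""
    servis = requests.get(f"{BASE_URL}/services/{slug}", headers=HEADERS, timeout=30)
    servis.raise_for_status()
    for model in servis.json().get("models", []):
        if model["modelId"] == model_id:
            return ((input_tokens / 1000) * model["inputPriceTl"]
                    + (output_tokens / 1000) * model["outputPriceTl"])
    raise ValueError(f"Model bulunamadı: {model_id}")

# Örnek: 1250 giriş + 340 çıkış token'lık bir çeviri isteği için yaklaşık maliyet
print(round(tahmini_maliyet_tl("translate", "openai/gpt-4o", 1250, 340), 4))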
Bakiye
Hesap bakiyenizi sorgulayın.
/api/v1/balance
Güncel bakiyenizi ve düşük bakiye eşiğini döner.
curl https://tektik.ai/api/v1/balance \
-H "Authorization: Bearer tkai_YOUR_API_KEY"Yanıt Örneği
{
"balanceTl": 1250.50,
"lowBalanceThreshold": 50,
"isLowBalance": false
}
Kullanım & İstatistik
API kullanım geçmişinizi ve istatistiklerinizi sorgulayın.
/api/v1/usage
Detaylı kullanım loglarını döner.
Sorgu Parametreleri
| Parametre | Tip | Açıklama |
|---|---|---|
| from | string (ISO tarih) | Başlangıç tarihi |
| to | string (ISO tarih) | Bitiş tarihi |
| keyId | number | Belirli bir API anahtarına göre filtreler |
| limit | number | Sonuç sayısı (varsayılan: 100) |
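Aşağıdaki curl örneğine ek olarak, kullanım kayıtlarını bir tarih aralığı için çekip toplam maliyeti özetleyen küçük bir Python taslağı şöyle görünebilir (requests kütüphanesi varsayılmıştır; alan adları bu bölümdeki yanıt örneğine dayanır):

# Varsayımsal taslak: tarih aralığındaki kullanım kayıtlarını toplar.
import requests

API_KEY = "tkai_YOUR_API_KEY"
BASE_URL = "https://tektik.ai/api/v1"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

params = {"from": "2026-03-01", "to": "2026-03-31", "limit": 100}
yanit = requests.get(f"{BASE_URL}/usage", headers=HEADERS, params=params, timeout=30)
yanit.raise_for_status()
kayitlar = yanit.json()

toplam_maliyet = sum(k["costTl"] for k in kayitlar)
toplam_giris = sum(k["inputTokens"] for k in kayitlar)
toplam_cikis = sum(k["outputTokens"] for k in kayitlar)
print(f"{len(kayitlar)} istek, {toplam_giris} giriş / {toplam_cikis} çıkış token, {toplam_maliyet:.2f} TL")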
curl "https://tektik.ai/api/v1/usage?from=2026-03-01&limit=50" \
-H "Authorization: Bearer tkai_YOUR_API_KEY"Yanıt Örneği
[
{
"id": 42,
"serviceId": 1,
"modelId": "openai/gpt-4o",
"inputTokens": 1250,
"outputTokens": 340,
"costTl": 0.0892,
"durationMs": 2340,
"createdAt": "2026-03-04T10:30:00Z"
}
]
/api/v1/usage/stats
Kullanım istatistiklerini döner: toplam harcama, günlük dağılım, servis/model/anahtar bazlı kırılımlar.
curl "https://tektik.ai/api/v1/usage/stats?from=2026-03-01" \
-H "Authorization: Bearer tkai_YOUR_API_KEY"Yanıt Örneği
{
"totalSpend": 456.78,
"totalRequests": 1250,
"avgCost": 0.37,
"topModel": "openai/gpt-4o",
"dailySpending": [
{ "date": "2026-03-01", "cost": 120.50 },
{ "date": "2026-03-02", "cost": 98.30 }
],
"byService": [
{ "serviceId": 1, "count": 800, "cost": 320.00 }
],
"byModel": [
{ "modelId": "openai/gpt-4o", "count": 600, "cost": 280.00 }
],
"byKey": [
{ "keyId": 1, "keyName": "Prod Key", "count": 1000, "cost": 400.00 }
]
}
Hata Kodları
API'den dönebilecek hata kodları ve anlamları.
| Hata Kodu | HTTP | Açıklama |
|---|---|---|
| AUTH_INVALID_KEY | 401 | API anahtarı geçersiz veya süresi dolmuş |
| AUTH_MISSING_KEY | 401 | Authorization header eksik |
| INSUFFICIENT_BALANCE | 402 | Bakiye yetersiz — bakiye yüklemeniz gerekiyor |
| RATE_LIMIT_EXCEEDED | 429 | İstek limiti aşıldı — biraz bekleyin |
| VALIDATION_ERROR | 400 | İstek parametreleri hatalı |
| MODEL_NOT_FOUND | 404 | Belirtilen model bulunamadı |
| MODEL_DEPRECATED | 410 | Model kullanımdan kaldırılmış |
| SERVICE_NOT_FOUND | 404 | Belirtilen servis bulunamadı |
| PROVIDER_ERROR | 502 | AI sağlayıcısından hata alındı |
| INTERNAL_ERROR | 500 | Sunucu hatası — lütfen tekrar deneyin |
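Bu kodları programatik olarak ele almak için basit bir Python taslağı aşağıdadır; hata gövdesinin biçimi hemen alttaki notta açıklanmıştır, kod yalnızca örnek amaçlıdır ve requests kütüphanesi varsayılmıştır.

# Varsayımsal taslak: hata yanıtındaki kodu okuyup duruma göre davranır.
import requests

def istek_gonder(url: str, headers: dict, payload: dict) -> dict:
    yanit = requests.post(url, headers=headers, json=payload, timeout=60)
    if yanit.ok:
        return yanit.json()
    hata = yanit.json().get("error", {})
    kod = hata.get("code", "INTERNAL_ERROR")
    if kod == "INSUFFICIENT_BALANCE":
        raise RuntimeError("Bakiye yetersiz; Dashboard'dan bakiye yükleyin.")
    if kod == "RATE_LIMIT_EXCEEDED":
        raise RuntimeError("İstek limiti aşıldı; bir süre bekleyip tekrar deneyin.")
    raise RuntimeError(f"{yanit.status_code} {kod}: {hata.get('message', '')}")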
Not: Tüm hata yanıtları aşağıdaki formatta döner:
{
"error": {
"code": "ERROR_CODE",
"message": "Türkçe hata açıklaması"
}
}
Rate Limiting
API istekleriniz aşağıdaki limitlerle sınırlandırılmıştır.
Kullanıcı Bazlı: 60 istek / dakika
IP Bazlı: 30 istek / dakika
Rate Limit Bilgileri
- Limit aşıldığında 429 RATE_LIMIT_EXCEEDED hatası döner.
- Sayaçlar her dakika sıfırlanır (120 saniye TTL).
- Kurumsal müşteriler için özel limitler tanımlanabilir.
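Limit aşımını yumuşak biçimde ele almak için istemci tarafında bekleyip yeniden denemek yeterlidir. Aşağıdaki Python taslağı yalnızca bir örnektir (requests kütüphanesi varsayılmıştır); bekleme süreleri, sayaçların her dakika sıfırlandığı varsayımına göre seçilmiştir.

# Varsayımsal taslak: 429 alındığında artan sürelerle bekleyip yeniden dener.
import time
import requests

def limitli_istek(url: str, headers: dict, payload: dict, deneme: int = 3) -> dict:
    for i in range(deneme):
        yanit = requests.post(url, headers=headers, json=payload, timeout=60)
        if yanit.status_code != 429:
            yanit.raise_for_status()
            return yanit.json()
        # Sayaçlar her dakika sıfırlandığı için kademeli olarak bekle.
        time.sleep(15 * (i + 1))
    raise RuntimeError("İstek limiti art arda aşıldı; daha sonra tekrar deneyin.")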