
AimlapiLLM

This page will help you get started with AI/ML API text completion models. For detailed documentation of all AimlapiLLM features and configurations, head to the API reference.

AI/ML API provides access to 300+ models (DeepSeek, Gemini, ChatGPT, and more) through a high-uptime, high-rate-limit API.

Overview

Integration details

Class | Package | Local | Serializable | JS support | Package downloads | Package latest
AimlapiLLM | langchain-aimlapi | | beta | | PyPI - Downloads | PyPI - Version

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs

Setup

To access AI/ML API models, sign up at aimlapi.com, generate an API key, and set the AIMLAPI_API_KEY environment variable:

import getpass
import os

if "AIMLAPI_API_KEY" not in os.environ:
    os.environ["AIMLAPI_API_KEY"] = getpass.getpass("Enter your AI/ML API key: ")

Installation

Install the langchain-aimlapi package:

%pip install -qU langchain-aimlapi
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate the AimlapiLLM model and generate text completions:

from langchain_aimlapi import AimlapiLLM

llm = AimlapiLLM(
    model="gpt-3.5-turbo-instruct",
    temperature=0.5,
    max_tokens=256,
)

Invocation

You can invoke the model with a prompt:

response = llm.invoke("Explain the bubble sort algorithm in Python.")
print(response)


Bubble sort is a simple sorting algorithm that repeatedly steps through the list to be sorted, compares each pair of adjacent items and swaps them if they are in the wrong order. This process is repeated until the entire list is sorted.

The algorithm gets its name from the way smaller elements "bubble" to the top of the list. It is commonly used for educational purposes due to its simplicity, but it is not a very efficient sorting algorithm for large data sets.

Here is an implementation of the bubble sort algorithm in Python:

1. Start by defining a function that takes in a list as its argument.
2. Set a variable "swapped" to True, indicating that a swap has occurred.
3. Create a while loop that runs as long as the "swapped" variable is True.
4. Inside the loop, set the "swapped" variable to False.
5. Create a for loop that iterates through the list, starting from the first element and ending at the second to last element.
6. Inside the for loop, compare the current element with the next element. If the current element is larger than the next element, swap them and set the "swapped" variable to True.
7. After the for loop, if the "swapped" variable
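The numbered steps in the model's answer (the sample output is truncated before its own code) can be sketched as a plain-Python implementation of the swap-flag variant of bubble sort:

```python
def bubble_sort(items):
    """Sort a list in place using the swapped-flag loop described above."""
    swapped = True
    while swapped:
        swapped = False
        # Compare each adjacent pair; swap when out of order.
        for i in range(len(items) - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # -> [1, 2, 4, 5, 8]
```

The loop exits once a full pass makes no swaps, which is exactly the termination condition step 3 describes.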

Streaming Invocation

You can also stream responses token-by-token:

llm = AimlapiLLM(
    model="gpt-3.5-turbo-instruct",
)

for chunk in llm.stream("List top 5 programming languages in 2025 with reasons."):
    print(chunk, end="", flush=True)
 

1. Python
Python has been consistently growing in popularity and has become one of the most widely used programming languages in recent years. It is used for a wide range of applications such as web development, data analysis, machine learning, and artificial intelligence. Its simple syntax and readability make it an attractive choice for beginners and experienced programmers alike. With the rise of data-driven technology and automation, Python is projected to be the most in-demand language in 2025.

2. JavaScript
JavaScript continues to dominate the web development scene and is expected to maintain its position as a top programming language in 2025. With the increasing use of front-end frameworks like React and Angular, JavaScript is crucial for building dynamic and interactive user interfaces. Additionally, the rise of serverless architecture and the popularity of Node.js make JavaScript an essential language for both front-end and back-end development.

3. Go
Go, also known as Golang, is a relatively new programming language developed by Google. It is designed for
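The streaming loop above consumes chunks the same way you would consume any Python generator. As a library-free sketch, the `fake_stream` generator below is a hypothetical stand-in for `llm.stream` (it is not part of langchain-aimlapi) that yields a string a few characters at a time:

```python
def fake_stream(text, chunk_size=5):
    """Yield `text` in small slices, mimicking token-level streaming."""
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

# Consume chunks as they arrive, just like the llm.stream(...) loop above.
pieces = []
for chunk in fake_stream("Top languages in 2025"):
    pieces.append(chunk)

assembled = "".join(pieces)
print(assembled)  # -> Top languages in 2025
```

Because `stream` returns an iterator, you can print chunks immediately, accumulate them, or stop early by breaking out of the loop.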


Chaining

You can also combine the model with a prompt template to structure user input. We can do this using LCEL:

from langchain_core.prompts import PromptTemplate

prompt = PromptTemplate.from_template("Tell me a joke about {topic}")
chain = prompt | llm
chain.invoke({"topic": "bears"})

API Reference: PromptTemplate

"\n\nWhy do bears have fur coats?\n\nBecause they'd look silly in sweaters! "
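For intuition, LCEL's `|` operator composes runnables left to right, so the prompt's output becomes the LLM's input. The `Step` class below is a hypothetical stdlib-only illustration of that composition pattern, not LangChain's actual implementation:

```python
class Step:
    """A callable wrapper supporting | for left-to-right composition."""

    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # (a | b).invoke(x) == b.invoke(a.invoke(x))
        return Step(lambda x: other.invoke(self.invoke(x)))

    def invoke(self, x):
        return self.fn(x)


# A toy "prompt | llm" pipeline: format a template, then fake a completion.
prompt = Step(lambda d: f"Tell me a joke about {d['topic']}")
llm = Step(lambda s: s.upper())

chain = prompt | llm
print(chain.invoke({"topic": "bears"}))  # -> TELL ME A JOKE ABOUT BEARS
```

Real LangChain runnables add batching, streaming, and async on top of this basic chaining idea, but the data flow through `|` is the same.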

API reference

For detailed documentation of all AimlapiLLM features and configurations, head to the API reference.