AgentMake AI

AgentMake AI: a software development kit for developing agentic AI applications that supports 14 AI backends and works with 7 agentic components, such as tools and agents. (Developer: Eliran Wong)

Supported backends: anthropic, azure, cohere, custom, deepseek, genai, github, googleai, groq, llamacpp, mistral, ollama, openai, vertexai, xai

Audio Introduction

Watch the video: 9-min introduction | 24-min introduction

Sibling Projects

This SDK incorporates the best aspects of our favorite projects, LetMeDoIt AI, Toolmate AI and TeamGen AI, to create a library aimed at further advancing the development of agentic AI applications.

Supported Platforms

Windows, macOS, Linux, ChromeOS, Android via Termux

Supported backends

anthropic - Anthropic API [docs]

azure - Azure OpenAI API [docs]

cohere - Cohere API [docs]

custom - any OpenAI-compatible backend that supports function calling

deepseek - DeepSeek API [docs]

genai - Vertex AI or Google AI [docs]

github - GitHub API [docs]

googleai - Google AI [docs]

groq - Groq Cloud API [docs]

llamacpp - Llama.cpp Server [docs] - local setup required

mistral - Mistral API [docs]

ollama - Ollama [docs] - local setup required

openai - OpenAI API [docs]

vertexai - Vertex AI [docs]

xai - XAI API [docs]

For simplicity, agentmake uses ollama as the default backend if the parameter backend is not specified. Ollama models are automatically downloaded if they have not already been downloaded. Users can change the default backend by modifying the environment variable DEFAULT_AI_BACKEND.
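Conceptually, the default-backend lookup behaves like an environment-variable read with a fallback. The sketch below illustrates that behaviour; it is a hypothetical illustration, not agentmake's actual internals:

```python
import os
from typing import Optional

def resolve_backend(explicit: Optional[str] = None) -> str:
    """Pick the AI backend: an explicit parameter wins, then the
    DEFAULT_AI_BACKEND environment variable, then ollama."""
    if explicit:
        return explicit
    return os.environ.get("DEFAULT_AI_BACKEND", "ollama")
```

For example, running export DEFAULT_AI_BACKEND=openai in your shell would make openai the default for subsequent runs.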

Introducing Agentic Components

agentmake is designed to work with seven kinds of components for building agentic applications:

  1. system - System messages are crucial for defining the roles of the AI agents and guiding how AI agents interact with users. Check out our examples. agentmake supports the use of fabric patterns as system components when running the agentmake function or CLI options (READ HERE).

  2. instruction - Predefined instructions that are added to users' prompts as prefixes before they are passed to the AI models. Check out our examples. agentmake supports the use of fabric patterns as instruction components when running the agentmake function or CLI options (READ HERE).

  3. input_content_plugin - Input content plugins process or transform user inputs before they are passed to the AI models. Check out our examples.

  4. output_content_plugin - Output content plugins process or transform assistant responses after they are generated by AI models. Check out our examples.

  5. tool - Tools take simple structured actions in response to users' requests, with the use of schema and function calling. Check out our examples.

  6. agent - Agents are agentic applications that automate multi-step actions or decisions to fulfill complicated requests. They can be executed on their own or integrated into an agentic workflow, supported by agentmake, to work collaboratively with other agents or components. Check out our examples.

  7. follow_up_prompt - Predefined prompts that are helpful for automating a series of follow-up responses after the first assistant response is generated. Check out our examples.

Built-in and Custom Agentic Components

agentmake supports both built-in agentic components, created by our developers or contributors, and custom agentic components, created by users to meet their own needs.

Built-in Agentic Components

Built-in agentic components are placed in the following six folders inside the agentmake package directory:

agents, instructions, plugins, prompts, systems, tools

To use the built-in components, you only need to specify the component filenames, without parent paths or file extensions, when you run the agentmake signature function or CLI options.

Custom Agentic Components

agentmake offers users two options for using their custom components.

Option 1: Specify the full file path of individual components

Since each component is organised as a single file, users only need to specify the file paths of the custom components they want to use when they run the agentmake signature function or CLI options.

Option 2: Place custom components into agentmake user directory

The default agentmake user directory is ~/agentmake, i.e. a folder named agentmake created under the user's home directory. Users may define their own path by modifying the environment variable AGENTMAKE_USER_DIR.

After creating a folder named agentmake under the user directory, create six sub-folders in it (agents, instructions, plugins, prompts, systems, tools) and place your custom components in the relevant folders, as we do with our built-in components.

If you organize the custom agentic components in this way, you only need to specify the component filenames, without parent paths or file extensions, when you run the agentmake signature function or CLI options.

Priorities

In cases where a built-in tool and a custom tool have the same name, the custom tool takes priority over the built-in one. This allows for flexibility, enabling users to copy a built-in tool, modify its content, and retain the same name, thereby effectively overriding the built-in tool.

Installation

Basic:

pip install --upgrade agentmake

Basic installation supports all AI backends mentioned above, except for vertexai.

Extras:

We support Vertex AI via the Google GenAI SDK. As the google-genai package supports most platforms, except for Android Termux, we separate it as an extra. To support Vertex AI with agentmake, install by running:

pip install --upgrade "agentmake[genai]"

Remarks

It is recommended not to install agentmake inside the directory ~/agentmake, as ~/agentmake is used by default for placing user custom content.

Usage

This SDK is designed to offer a single signature function agentmake for interacting with all AI backends, delivering a unified experience for generating AI responses. The main APIs are provided by the function agentmake, located in this file.

Find documentation at https://github.com/eliranwong/agentmake/blob/main/docs/README.md

Examples

The following examples assume Ollama is installed and used as the default backend.

To import:

import os
from agentmake import agentmake

To run, e.g.:

agentmake("What is AI?")

To work with parameter tool, e.g.:

agentmake("What is AgentMake AI?", tool="search/google")

agentmake("How many 'r's are there in the word 'strawberry'?", tool="magic")

agentmake("What time is it right now?", tool="magic")

agentmake("Open github.com in a web browser.", tool="magic")

agentmake("Convert file 'music.wav' into mp3 format.", tool="magic")

agentmake("Send an email to Eliran Wong at eliran.wong@domain.com to express my gratitude for his work.", tool="email/gmail")

A cross-platform solution to work with a tool that is placed in a sub-folder, e.g.:

agentmake("Extract text from image file 'sample.png'.", tool=os.path.join("perplexica", "github"))

To work with parameters input_content_plugin and output_content_plugin, e.g.:

agentmake("what AI model best", input_content_plugin="improve_writing", output_content_plugin="translate_into_chinese", stream=True)

To work with plugin that is placed in a sub-folder, e.g.:

agentmake("你好吗?", output_content_plugin=os.path.join("chinese", "convert_simplified"))

To automate prompt engineering:

agentmake("what best LLM training method", system="auto", input_content_plugin="improve_prompt")

To work with parameter system, instruction, follow_up_prompt, e.g.:

agentmake("Is it better to drink wine in the morning, afternoon, or evening?", instruction="reflect", stream=True)

agentmake("Is it better to drink wine in the morning, afternoon, or evening?", instruction="think", follow_up_prompt=["review", "refine"], stream=True)

agentmake("Provide a detailed introduction to generative AI.", system=["create_agents", "assign_agents"], follow_up_prompt="Who is the best agent to contribute next?", stream=True, model="llama3.3:70b")

To work with parameter agent, e.g.:

agentmake("Write detailed comments about the works of William Shakespeare, focusing on his literary contributions, dramatic techniques, and the profound impact he has had on the world of literature and theatre.", agent="teamwork", stream=True, model="llama3.3:70b")

To specify an AI backend:

agentmake("What is Microsoft stock price today?", tool=os.path.join("search", "finance"), backend="azure")

To work collaboratively with different backends, e.g.

messages = agentmake("What is the most effective method for training AI models?", backend="openai")
messages = agentmake(messages, backend="googleai", follow_up_prompt="Can you give me some different options?")
messages = agentmake(messages, backend="xai", follow_up_prompt="What are the limitations or potential biases in this information?")
agentmake(messages, backend="mistral", follow_up_prompt="Please provide a summary of the discussion so far.")

As you can see, the agentmake function returns the messages list, which is passed to the next agentmake function in turn.
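The messages value follows the familiar OpenAI-style chat format: a list of dictionaries with role and content keys. An illustrative example of its shape after one round trip (the assistant content here is a placeholder, not real model output):

```python
messages = [
    {"role": "system", "content": "You are an AI assistant."},
    {"role": "user", "content": "What is the most effective method for training AI models?"},
    {"role": "assistant", "content": "(model response)"},
]

# Passing this list to the next agentmake call preserves the full
# conversation history, so each backend sees the prior context.
```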

Therefore, it is very simple to create a chatbot application; you can do it in five lines of code or fewer, e.g.:

messages = [{"role": "system", "content": "You are an AI assistant."}]
user_input = "Hello!"
while user_input:
    messages = agentmake(messages, follow_up_prompt=user_input, stream=True)
    user_input = input("Enter your query:\n(enter a blank entry to exit)\n>>> ")

You may take a look at our built-in components for more ideas:

systems

instructions

plugins

tools

agents

prompts

CLI Options

The CLI commands are designed for quickly running AI features.

To check the available CLI options, run:

agentmake -h

Two shortcut commands:

ai == agentmake

aic == agentmake -c with chat features enabled

The available CLI options use the same parameter names as the agentmake function for AI backend configurations, to offer users a unified experience. Below are some CLI examples that are equivalent to some of the examples mentioned above:

ai What is AI?

ai What is AgentMake AI --tool search/google

ai Convert file music.wav into mp3 format. --tool task

ai Send an email to Eliran Wong at eliran.wong@domain.com to express my gratitude for his work --tool email/gmail

ai Extract text from image file sample.png. --tool=ocr/openai

ai What is Microsoft stock price today? -t search/finance -b azure

ai what AI model best --input_content_plugin improve_writing --output_content_plugin translate_into_chinese

ai what best LLM training method --system auto --input_content_plugin improve_prompt

ai 你好吗? --output_content_plugin=chinese/convert_simplified

ai Is it better to drink wine in the morning, afternoon, or evening? --instruction think --follow_up_prompt review --follow_up_prompt refine

ai Write detailed comments about the works of William Shakespeare, focusing on his literary contributions, dramatic techniques, and the profound impact he has had on the world of literature and theatre --agent teamwork --model "llama3.3:70b"

CLI for Testing

CLI options are handy for testing, e.g. simply specify a newly developed tool file with the -t option and run:

ai What is AgentMake AI? -t ~/my_folder/perplexica.py

AI Backends Configurations

For quick start, run:

agentmake -ec

For more options:

To use ollama as the default backend, you need to download and install Ollama. To use backends other than Ollama, you need your own API keys. There are a few ways you may configure the AI backends to work with agentmake.

Option 1 - Use the agentmake function

Specify AI backend configurations as parameters when you run the signature function agentmake.

Configurations set via option 1 override the defaults set by options 2 and 3, but only for the particular function call in which they are specified. The default configurations described below in options 2 and 3 still apply the next time you run the agentmake function without specifying the AI backend parameters. This gives you the flexibility to use per-call settings in addition to the defaults.
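This precedence rule behaves like a dictionary merge: defaults first, with per-call parameters layered on top for that call only. A hypothetical sketch, not agentmake's actual implementation:

```python
def effective_config(defaults: dict, per_call: dict) -> dict:
    """Merge per-call parameters over defaults without mutating the
    defaults, so later calls fall back to them again."""
    merged = dict(defaults)
    merged.update({k: v for k, v in per_call.items() if v is not None})
    return merged

defaults = {"backend": "ollama", "model": None}
one_call = effective_config(defaults, {"backend": "openai"})
# one_call uses openai; defaults is unchanged, so the next call
# without overrides still uses ollama.
```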

Option 2 - Export individual environment variables

You may manually export individual environment variables listed in https://github.com/eliranwong/agentmake/blob/main/agentmake.env

Option 3 - Export default environment variables once for all

  1. Make a copy of the file agentmake.env, located in the package directory, as:

either ~/agentmake/agentmake.env

cd agentmake # where you installed agentmake
cp agentmake.env ~/agentmake/agentmake.env

or <package_directory>/.env:

cd agentmake # where you installed agentmake
cp agentmake.env .env

  2. Edit the file manually with a text editor, e.g.

etextedit ~/agentmake/agentmake.env

  3. Save the changes

The changes apply the next time you run the agentmake function or CLI.

Option 4 - Run built-in CLI option

Use built-in agentmake cli option to edit the variables:

agentmake -ec

What does this command do?

  • It automatically makes a copy of the file agentmake.env and saves it as <package_directory>/.env if neither <package_directory>/.env nor ~/agentmake/agentmake.env exists.
  • It uses the text editor specified in DEFAULT_TEXT_EDITOR to open the configuration file: ~/agentmake/agentmake.env if it exists, or <package_directory>/.env otherwise.

Remember to save your changes to make them effective.

Note about Vertex AI Setup

Make sure the extra package genai is installed with the command mentioned above:

pip install --upgrade "agentmake[genai]"

To configure, run:

ai -ec

Enter the path of your Google application credentials JSON file as the value of VERTEXAI_API_KEY. You also need to specify your project ID and service location in the configurations, e.g.:

VERTEXAI_API_KEY=~/agentmake/google_application_credentials.json
VERTEXAI_API_PROJECT_ID=my_project_id
VERTEXAI_API_SERVICE_LOCATION=us-central1

To test Gemini 2.0 with Vertex AI, e.g.:

ai -b vertexai -m gemini-2.0-flash Hi!

Remarks

  1. Please do not directly edit the file agentmake.env located in the package directory, as it is restored to its default values upon each upgrade. It is recommended to make a copy of it and edit the copy instead.
  2. Multiple API keys are supported for running the backends cohere, github, groq and mistral. You may configure multiple API keys for these backends in the .env file by using commas as separators, e.g. COHERE_API_KEY=cohere_api_key_1,cohere_api_key_2,cohere_api_key_3
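A comma-separated value like this can be split into a list and rotated round-robin. The sketch below illustrates the idea; agentmake's actual rotation logic may differ:

```python
from itertools import cycle

def key_rotation(env_value: str):
    """Split a comma-separated key string and cycle through the keys,
    e.g. to spread requests across several API keys."""
    keys = [k.strip() for k in env_value.split(",") if k.strip()]
    return cycle(keys)

rotation = key_rotation("cohere_api_key_1,cohere_api_key_2")
# each call to next(rotation) yields the next key, wrapping around
```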

Fabric Integration

fabric is a fantastic third-party project that offers a great collection of patterns.

agentmake supports the use of fabric patterns as entries for the system or instruction parameters when running the agentmake signature function or CLI options.

To use a fabric pattern in agentmake:

  1. Install fabric
  2. Specify a fabric pattern in the agentmake parameter system or instruction, prefixing the selected pattern name with fabric., e.g.:

agentmake("The United Kingdom is a Christian country.", tool="search/searxng", system="fabric.analyze_claims")
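Conceptually, detecting a fabric entry is a prefix check on the value passed to system or instruction. A hypothetical sketch of that dispatch (not agentmake's actual internals):

```python
def parse_pattern(value: str):
    """Classify a system/instruction entry: values prefixed with
    'fabric.' refer to fabric patterns; anything else refers to an
    agentmake component."""
    prefix = "fabric."
    if value.startswith(prefix):
        return ("fabric", value[len(prefix):])
    return ("agentmake", value)
```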

Local Backends with GPU Acceleration

Both local backends, ollama and llamacpp, support GPU acceleration.

TODO

  • add more examples
  • convert available ToolMate AI tools into tools that are runnable with this SDK (... in progress ...)
  • add documentation about tool creation
