SkywardAI is an open-source community dedicated to developing efficient and specialized AI models. Founded by passionate students from RMIT University Melbourne, our mission is to create high-performing AI solutions that prioritize accessibility, efficiency, and sustainability. We empower users to run advanced AI models on consumer-grade hardware while supporting cloud-native deployment.
The community is inspired by LocalAI, an open-source alternative to the OpenAI API.
Our open-source AI community aims to develop specialized, high-performing models that are not only smaller but also optimized for specific tasks. Unlike massive LLMs, these models prioritize efficiency and accessibility while maintaining top-tier performance. By minimizing data requirements and reducing computational demands, we can create a more sustainable AI ecosystem—maximizing intelligence per watt. This approach keeps AI both powerful and practical, making advanced technology accessible for widespread adoption.
Our mission is to make AI both powerful and practical by developing efficient, accessible, and sustainable models.
We aim to deliver:
- Highly efficient, specialized AI models optimized for specific tasks
- Smooth streaming inference on CPUs with open-source multimodal models under 10B parameters
- Retrieval-augmented generation (RAG) inference for grounding responses in user-provided data (see the sketch after this list)
- A simple trainer for customizing neural-network training
- One-command deployment for seamless integration
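To illustrate what RAG-mode inference could look like in practice, here is a minimal sketch in Python. It assumes a locally hosted, OpenAI-compatible chat endpoint in the LocalAI style; the URL, model name, and toy retriever are placeholders for illustration, not SkywardAI's actual API.

```python
# Minimal RAG-style inference sketch against a hypothetical
# OpenAI-compatible endpoint (URL and model name are assumptions).
import requests

API_URL = "http://localhost:8080/v1/chat/completions"  # placeholder endpoint
MODEL = "local-7b-instruct"                             # placeholder model name


def retrieve_context(question: str) -> list[str]:
    """Stand-in retriever: a real setup would query a vector store instead."""
    knowledge_base = {
        "deployment": "The service can be deployed with a single command on consumer hardware.",
        "models": "Specialized models under 10B parameters are optimized for CPU inference.",
    }
    return [text for key, text in knowledge_base.items() if key in question.lower()]


def rag_answer(question: str) -> str:
    """Prepend retrieved passages to the prompt, then ask the chat endpoint."""
    context = "\n".join(retrieve_context(question)) or "No context found."
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    }
    response = requests.post(API_URL, json=payload, timeout=60)
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(rag_answer("How small are the models used for CPU inference?"))
```

The same pattern applies whether the retriever is a keyword lookup, as here, or a vector database: retrieved passages are injected into the prompt before generation, and the request shape follows the OpenAI chat-completions convention that LocalAI-style servers expose.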
Our vision is to create a thriving community that drives the future of AI with sustainability and efficiency at its core.
- AI should be both accessible and efficient—maximizing intelligence per watt
- Everyone can be a neural network hero—today we define functions, tomorrow we define neural networks
- AI isn't just a trend; it should be a practical, deployable solution on consumer-grade hardware
- By reducing data requirements and computational overhead, we ensure AI remains sustainable and widely available