AuraNexus - AI Content Orchestration Platform

AuraNexus is an open-source, self-hosted platform that automates the entire content lifecycle: Ideation → Creation → Optimization → Multimedia Production → Publishing → Analytics.

Features

  • AI-Powered Content Generation: Generate text, images, and multimedia content using open-source models
  • Multi-Platform Distribution: Publish to Twitter, Instagram, LinkedIn, YouTube, and more
  • Smart Optimization: Automatically optimize content for different platforms and audiences
  • Analytics Dashboard: Track performance and get AI-powered insights
  • Privacy-First: Self-hosted solution with complete control over your data

Architecture

The platform consists of five core modules:

  1. AI Gateway: Unified interface for LLMs and diffusion models
  2. Content Studio: Context-aware editor with auto-generation capabilities
  3. Multimedia Factory: Pipeline for text-to-image, text-to-video generation
  4. Omni-Channel Distributor: Adapters for various social platforms
  5. Insight Dashboard: Analytics and recommendation engine
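The Omni-Channel Distributor's per-platform adapters can be pictured as subclasses of a common base class. The sketch below is illustrative only; the class and method names are assumptions, not the actual AuraNexus API:

```python
from abc import ABC, abstractmethod


class PlatformAdapter(ABC):
    """Illustrative base class for a social-platform adapter (hypothetical names)."""

    platform_name: str = "generic"

    @abstractmethod
    def format_post(self, text: str) -> str:
        """Adapt raw content to the platform's constraints."""

    @abstractmethod
    def publish(self, text: str) -> dict:
        """Send the formatted post; returns a status payload."""


class TwitterAdapter(PlatformAdapter):
    platform_name = "twitter"
    MAX_LEN = 280

    def format_post(self, text: str) -> str:
        # Truncate to the platform's character limit, marking the cut with an ellipsis.
        return text if len(text) <= self.MAX_LEN else text[: self.MAX_LEN - 1] + "…"

    def publish(self, text: str) -> dict:
        # A real adapter would call the platform's API here.
        return {"platform": self.platform_name, "status": "queued",
                "body": self.format_post(text)}
```

Each platform then only needs to implement formatting and delivery; the distributor can treat all adapters uniformly through the base interface.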

Quick Start

Prerequisites

  • Python 3.8+
  • Docker and Docker Compose (optional, for containerized deployment)

Running the Platform

  1. Clone the repository:

    git clone https://huggingface.co/[YOUR_USERNAME]/auranexus
    cd auranexus
    
  2. Start the platform:

    ./startup.sh
    
  3. Access the platform in your browser

  4. Create an account using the registration form

  5. Start generating content using the content creation interface

Using Docker (Alternative Method)

  1. Build and start services:

    docker-compose up --build
    
  2. Access the platform in your browser

API Endpoints

  • POST /api/v1/auth/register - Register a new user
  • POST /api/v1/auth/token - Get authentication token
  • POST /api/v1/content/generate - Generate new content
  • POST /api/v1/content/optimize - Optimize existing content
  • GET /api/v1/content/my - Get your content items
  • GET /api/v1/models - List available AI models
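The endpoints above can be exercised from any HTTP client. Below is a minimal stdlib sketch of the register → token → generate flow; the request/response field names (`username`, `password`, `prompt`, `access_token`) are assumptions, so check the API schema for the exact payloads:

```python
import json
import urllib.request
from typing import Optional

BASE_URL = "http://localhost:8000"  # assumed default; adjust to your deployment


def build_request(path: str, payload: dict,
                  token: Optional[str] = None) -> urllib.request.Request:
    """Build a JSON POST request; a bearer token is attached when provided."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    return req


def post_json(path: str, payload: dict, token: Optional[str] = None) -> dict:
    """Send the request and decode the JSON response."""
    with urllib.request.urlopen(build_request(path, payload, token)) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Typical flow (field names are illustrative):
#   post_json("/api/v1/auth/register", {"username": "alice", "password": "..."})
#   token = post_json("/api/v1/auth/token", {"username": "alice", "password": "..."})["access_token"]
#   post_json("/api/v1/content/generate", {"prompt": "..."}, token=token)
```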

Development

Project Structure

auranexus/
├── api/                 # FastAPI application
├── database/            # Database models and session management
├── core/                # Core utilities (AI gateway, security)
├── services/            # Business logic services
├── utils/               # Utility functions
├── frontend/            # Frontend files
├── main.py              # Main application entry point
├── requirements.txt     # Python dependencies
└── docker-compose.yml   # Docker configuration

Extending the Platform

  1. Add new AI models: Extend the AIGateway class in core/ai_gateway.py
  2. Add platform adapters: Create new classes in services/ for additional social platforms
  3. Enhance content types: Modify the content studio service to support new content types
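As a rough sketch of step 1, a registry-style gateway might look like the following. This is a hypothetical shape only; the real `AIGateway` class in `core/ai_gateway.py` may expose a different interface:

```python
# Hypothetical sketch; the actual AIGateway interface may differ.
class AIGateway:
    """Unified interface over registered model backends."""

    def __init__(self):
        self._backends = {}

    def register_model(self, name, backend):
        # backend: any callable taking a prompt and returning generated content.
        self._backends[name] = backend

    def generate(self, model_name, prompt):
        if model_name not in self._backends:
            raise KeyError(f"unknown model: {model_name}")
        return self._backends[model_name](prompt)


# Registering a new (toy) model backend:
gateway = AIGateway()
gateway.register_model("echo", lambda prompt: f"echo: {prompt}")
```

With this shape, adding a new model is just another `register_model` call wrapping the model's inference function.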

Production Deployment

For production deployment:

  1. Set environment variables:

    export SECRET_KEY="your-super-secret-key-change-in-production"
    export DATABASE_URL="postgresql://user:password@host:port/dbname"
    
  2. Use the provided Docker Compose configuration

  3. Set up SSL certificates for HTTPS

  4. Configure a reverse proxy (nginx recommended)
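In application code, the environment variables from step 1 can be read with development-only fallbacks. This is a minimal sketch assuming a simple settings helper; AuraNexus's actual configuration module may differ:

```python
import os


def load_settings() -> dict:
    """Read production settings from the environment, with dev-only fallbacks."""
    secret_key = os.environ.get("SECRET_KEY")
    if not secret_key:
        # Never ship this fallback to production; always set SECRET_KEY there.
        secret_key = "dev-only-insecure-key"
    return {
        "secret_key": secret_key,
        "database_url": os.environ.get("DATABASE_URL", "sqlite:///./auranexus.db"),
    }
```

Keeping the fallbacks obviously insecure (and SQLite-based) makes it hard to deploy to production without setting the real values.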

Contributing

We welcome contributions! Please see our contributing guidelines for more information.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For support, please open an issue in this repository or join our community forums.
