DubAPI Bot

A Node.js API for creating queup.net bots, containerized with Docker for 24/7 operation on Hugging Face Spaces.

Features

  • Queup.net bot functionality
  • Dockerized for consistent deployment
  • 24/7 operation on Hugging Face Spaces
  • Easy configuration through environment variables

Installation

npm install dubapi

Optionally, the WebSocket implementation can use native addons for improved performance and spec compliance:

npm install --save-optional bufferutil utf-8-validate

Usage

const DubAPI = require('dubapi');

new DubAPI({username: '', password: ''}, function(err, bot) {
    if (err) return console.error(err);

    console.log('Running DubAPI v' + bot.version);

    function connect() {
        bot.connect('friendship-is-magic');
    }

    bot.on('connected', function(name) {
        console.log('Connected to ' + name);
    });

    bot.on('disconnected', function(name) {
        console.log('Disconnected from ' + name);
        setTimeout(connect, 15000);
    });

    bot.on('error', function(err) {
        console.error(err);
    });

    bot.on(bot.events.chatMessage, function(data) {
        console.log(data.user.username + ': ' + data.message);
    });

    connect();
});
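Chat commands can be layered on top of the `chatMessage` handler above. Below is a minimal sketch of a command parser in plain JavaScript; the `!command` prefix convention and the function name are illustrative choices, not part of DubAPI:

```javascript
// Parse "!command arg1 arg2" style chat messages.
// Returns null for ordinary messages so the handler can ignore them.
function parseCommand(message) {
    if (typeof message !== 'string' || message[0] !== '!') return null;
    const [name, ...args] = message.slice(1).trim().split(/\s+/);
    if (!name) return null;
    return {name: name.toLowerCase(), args};
}
```

Inside the `chatMessage` handler you could then dispatch on `parseCommand(data.message)` and reply through the bot's chat method (verify the exact method name, e.g. `sendChat`, against the DubAPI version you are using).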

Docker Deployment

The bot is containerized using Docker for easy deployment and 24/7 operation on Hugging Face Spaces.

Local Development

  1. Build and run using Docker Compose:
docker-compose up --build
  2. Stop the container:
docker-compose down
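A minimal docker-compose.yml matching the commands above might look like the following; the service name and the environment values are illustrative and should be adjusted to your project:

```yaml
services:
  bot:
    build: .
    restart: unless-stopped    # restart automatically so the bot stays up 24/7
    environment:
      - NODE_ENV=production
```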

Hugging Face Spaces Deployment

  1. Create a new Space on Hugging Face Spaces
  2. Choose "Docker" as the SDK
  3. Upload the following files to your Space:
    • Dockerfile
    • .dockerignore
    • package.json
    • All your source files
  4. The container will automatically build and deploy on Hugging Face Spaces
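A minimal Dockerfile for such a Space could be sketched as follows; the base image tag and the entry-point file name are assumptions, so adjust them to match your project:

```dockerfile
# Base image tag is an assumption -- pin whatever Node version you target.
FROM node:18-alpine

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY package*.json ./
RUN npm install --production

# Copy the rest of the source
COPY . .

ENV NODE_ENV=production

# Entry file name is an assumption -- change to your bot's main script.
CMD ["node", "index.js"]
```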

Environment Variables

Configure these in your Hugging Face Space settings:

  • NODE_ENV: Set to "production"
  • Add any other required environment variables for your bot
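Inside the bot, these settings are read from `process.env`. A small sketch, assuming illustrative variable names (`BOT_USERNAME` and `BOT_PASSWORD` are not required by DubAPI; use whatever names you configure in the Space settings):

```javascript
// Build the DubAPI config from environment variables.
// BOT_USERNAME / BOT_PASSWORD are illustrative names -- match them to
// the variables you define in your Hugging Face Space settings.
const config = {
    username: process.env.BOT_USERNAME || '',
    password: process.env.BOT_PASSWORD || '',
    production: process.env.NODE_ENV === 'production'
};
```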
