Hey self-hosted community 👋

My friend and I have been hacking on SecureAI Tools — an open-source AI tools platform for everyone’s productivity. And we have our very first release 🎉

Here is a quick demo: https://youtu.be/v4vqd2nKYj0

Get started: https://github.com/SecureAI-Tools/SecureAI-Tools#install

Highlights:

  • Local inference: Runs AI models locally. Supports 100+ open-source (and semi-open-source) AI models.
  • Built-in authentication: Simple email/password authentication, so it can be opened to the internet and accessed from anywhere.
  • Built-in user management: So family members or coworkers can use it as well if desired.
  • Self-hosting optimized: Designed to run on your own hardware with a simple Docker Compose setup (see the sketch after this list).
  • Lightweight: A simple web app with a SQLite DB, so there's no separate database container to run. Data is persisted on the host machine through Docker volumes.
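To make the self-hosting and lightweight points a bit more concrete, here is a rough sketch of what running it looks like. This isn't the official install procedure (that's at the install link above); it just assumes the docker-compose.yml from the repo, like the one quoted in the comments further down:

# Put the repo's docker-compose.yml (and its .env) in a directory, then:
docker compose up -d

# Two containers come up: the web app (chat data in a SQLite file under ./web
# on the host) and the Ollama inference server (models cached under ./inference).
docker compose ps

# The web UI listens on port 28669, e.g. http://localhost:28669

Backups are basically just those two host directories, since there's no separate database server to dump.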

In the future, we're looking to add more AI tools, such as chat-with-documents and a Discord bot. Please let us know if there are specific ones you'd like us to build, and we'll be happy to add them to our to-do list.

Please give it a go and let us know what you think – we'd love to get your feedback. And feel free to contribute if you'd like; contributions are very welcome :)

We also have a small Discord community at https://discord.gg/YTyPGHcYP9, so consider joining if you'd like to follow along.

2 points

Hardware requirements:

  • RAM: As much as the AI model requires. Most models have a variant that works well with 8 GB of RAM.
  • GPU: Recommended but not required. CPU-only mode works too, but it's slower on Linux, Windows, and Intel Macs; on M1/M2/M3 Macs, inference speed is really good. (See the quick GPU check after this list.)
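(Not from the post, just a general check.) If you do have an NVIDIA card, you can confirm that Docker actually sees it before relying on GPU inference; this assumes the NVIDIA Container Toolkit is installed, which is what the GPU reservation in the compose file further down needs:

# Should print the usual nvidia-smi table. If it errors, the NVIDIA Container
# Toolkit isn't set up yet and the GPU reservation in docker-compose.yml won't work.
docker run --rm --gpus all ubuntu nvidia-smi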

(For some reason, my response to the original comment isn't showing up, so reposting here.)

1 point

How does it get its training data? Would this work offline?

1 point

Nicely done! What are the options like for AMD GPUs? Any plans to support them in the future?

1 point

This looks awesome! My little project was missing just that: https://github.com/rogueghost93/fly-hi. I'll add it in the next few days!

1 point

I can’t get it running with my GPU.

I get this error:

parsing /root/secure-ai-tools/docker-compose.yml: yaml: line 19: did not find expected key

This is my docker-compose.yml:

services:
  web:
    image: public.ecr.aws/d8f2p0h3/secure-ai-tools:latest
    platform: linux/amd64
    volumes:
      - ./web:/app/volume
    env_file:
      - .env
    environment:
      - INFERENCE_SERVER=http://inference:11434/
    ports:
      - 28669:28669
    command: sh -c "cd /app && sh tools/db-migrate-and-seed.sh ${DATABASE_FILE} && node server.js"
    depends_on:
      - inference

  inference:
    image: ollama/ollama:latest
    volumes:
      - ./inference:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 'all'
              capabilities: [gpu]
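(Not part of the comment above, just a general Docker Compose tip.) "did not find expected key" is usually an indentation problem, and Compose will point at the exact line where parsing fails:

# Parse and validate docker-compose.yml; YAML errors are reported with line
# numbers, and on success the fully resolved config is printed.
docker compose config

The usual cause is inconsistent indentation somewhere in the nested mappings (the deploy block is deeply nested), which is easy to introduce when editing by hand.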

