Diana Services for Neon AI

DIANA SERVICES

Device Independent API for Neon AI Applications (DIANA)

Diana provides a collection of services for automated setup and common administration tasks for business and enterprise installations of Neon AI applications. Users can choose which of the services to run. Notable uses of Diana services include:

  • Diana Server TTS and Translation Services 
  • Docker Support
  • Kubernetes Support — Cluster Portion Support
  • Kubernetes Cluster to Cert Package — Domain SSL Certification 
  • Backend Services — Credentials — Cluster Management — CM Secrets — Docker Compose
  • Ingress Support — for Cluster Portion Support 
  • Manual Configuration for Docker  
  • Diana Utilities Module — Automated Certbot and RabbitMQ Configuration, HTTPS Services, Web Socket Services, Port IO Services, and Message Bus 

Docker Compose

Docker Compose offers a simple method for starting a set of Diana services. All Diana services run as standalone containers connected to the Neon AI RabbitMQ server.
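
As a minimal sketch of what this looks like in practice (assuming a docker-compose.yml generated by Diana sits in the working directory; the file name and the set of services depend on your configuration), the standard Docker Compose commands apply:

  # Start all configured Diana services in the background
  docker compose up -d

  # Follow service logs, or stop everything when finished
  docker compose logs -f
  docker compose down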

Diana is the starting point for Neon AI backend components and tools, many of which are user-configurable. Diana generates resources for both Docker and Kubernetes. The following instructions are provided for downloading the Neon AI Docker Containers, Diana Services, and the Diana Backend Server.
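
As a starting point, here is a hedged install sketch: it assumes the Diana utilities are distributed as the Python package neon-diana-utils on PyPI and that Docker is already available; adjust for your environment.

  # Install the Diana utilities (includes the diana command-line tool)
  pip install neon-diana-utils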

Diana commands include (see the sketch after this list):

  • Automatically Configure a Diana Backend
  • Configure Optional HTTP Backend Services
  • Generate Kubernetes Secrets Authorization Variables
  • Generate a Kubernetes ConfigMap for RabbitMQ
  • Start / Stop a Diana Backend
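
A hedged sketch of invoking these through the diana command-line tool follows. Only configure-backend is named elsewhere in this document; the other subcommand names and options vary by version, so check the built-in help for your installation.

  # List the subcommands available in your installed version
  diana --help

  # Automatically configure a Diana backend (generates Docker Compose and
  # Kubernetes resources; see the Kubernetes section below)
  diana configure-backend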

Kubernetes

For deployment, Kubernetes provides a managed solution for containerization, scaling, rolling updates, and more. Diana deployments work with your existing clusters, managed by your system administrator using the NGINX Ingress Controller. For local deployments, you can use MetalLB to provide a LoadBalancer.
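
As one hedged example of preparing a cluster (general Kubernetes setup rather than anything Diana-specific), the NGINX Ingress Controller and MetalLB can be installed with Helm. The chart locations below are the upstream projects' published repositories at the time of writing; confirm them, and the MetalLB address-pool configuration, against the current upstream documentation.

  # Install the NGINX Ingress Controller from the upstream Helm chart
  helm upgrade --install ingress-nginx ingress-nginx \
    --repo https://kubernetes.github.io/ingress-nginx \
    --namespace ingress-nginx --create-namespace

  # For local or bare-metal clusters, MetalLB can provide LoadBalancer addresses
  helm repo add metallb https://metallb.github.io/metallb
  helm install metallb metallb/metallb --namespace metallb-system --create-namespace
  # MetalLB still needs an IPAddressPool (and L2Advertisement) for your network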

A Diana backend can be configured automatically with the Diana configure-backend command. This generates the k8s_secret_mq-config.yml, k8s_config_rabbitmq.yml, and kubernetes.yml spec files, and creates an instance of RabbitMQ via Docker in the process.
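
A minimal sketch of applying the generated spec files to an existing cluster is shown below; the file names are those listed above, and it assumes kubectl is already pointed at the intended cluster and namespace.

  # Apply the generated Diana backend resources
  kubectl apply -f k8s_secret_mq-config.yml \
                -f k8s_config_rabbitmq.yml \
                -f kubernetes.yml

  # Check that the backend pods come up
  kubectl get pods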

Resources can also be generated for deploying containers under Kubernetes and OpenShift infrastructures, in addition to Docker Compose.

On-Device Speech To Text (STT) and Text To Speech (TTS)

Full on-device polylingual speech-to-text (STT) and text-to-speech (TTS) are provided for:

  • Linux
  • Raspberry Pi
  • Mycroft Mark 2

In addition to English, foreign-language support is provided for:

  • French
  • Spanish
  • German
  • Italian
  • Russian
  • Polish

… and a maturing list of other languages

 

GitHub

The GitHub link for the latest default branch is: 

Support

If your organization is standardizing on Neon AI, the Neon AI support team is available to help with your installation of Neon AI and Diana, and to assist your IT team in using Docker containers and Kubernetes orchestration.

For corporate support, please email josh@neon.ai with a brief reference to your application, and we'll get right back to you. We're happy to follow up through email, GitHub, or by setting up a time to chat online, as you prefer.

– Josh R <josh@neon.ai>