In-app integration
If you are a developer, you can use our FAQ API to embed Alterra Answers in your mobile applications.
Currently, the API is cloud-based. However, the underlying engine has a small footprint and can run entirely on device, covering both ML training and inference, without a cloud connection. It runs on a CPU (no GPU or TPU is needed), is roughly 1,000x faster than a typical RNN, and uses only about 10 MB of memory. Native iOS and Android ports are available.
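For illustration only, below is a minimal sketch of what a client-side call to the cloud-hosted FAQ API could look like in Swift. The endpoint URL, request and response field names (question, answer, confidence), and the bearer-token authentication scheme are assumptions for the example, not the actual Alterra Answers API; substitute the values from your Alterra account.

```swift
import Foundation

// Hypothetical request/response shapes -- the real FAQ API may use
// different field names and a different payload structure.
struct FAQQuery: Codable {
    let question: String
}

struct FAQAnswer: Codable {
    let answer: String
    let confidence: Double
}

// Sends a visitor's question to a (hypothetical) cloud endpoint and returns
// the best-matching FAQ answer via the completion handler.
func askFAQ(question: String,
            apiKey: String,
            completion: @escaping (Result<FAQAnswer, Error>) -> Void) {
    // Placeholder URL -- replace with the endpoint provided for your account.
    let url = URL(string: "https://api.example.com/v1/faq/query")!

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.httpBody = try? JSONEncoder().encode(FAQQuery(question: question))

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error {
            completion(.failure(error))
            return
        }
        do {
            let answer = try JSONDecoder().decode(FAQAnswer.self, from: data ?? Data())
            completion(.success(answer))
        } catch {
            completion(.failure(error))
        }
    }.resume()
}

// Example usage from a view model or view controller:
// askFAQ(question: "How do I reset my password?", apiKey: "YOUR_API_KEY") { result in
//     switch result {
//     case .success(let faq): print(faq.answer)
//     case .failure(let err): print("FAQ lookup failed: \(err)")
//     }
// }
```

The same request pattern applies on Android with your preferred HTTP client; only the transport code changes, not the query/answer exchange.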