diff --git a/README.md b/README.md
index 549d4d8..806e92e 100644
--- a/README.md
+++ b/README.md
@@ -3,14 +3,14 @@
[License: MIT](https://opensource.org/licenses/MIT)
-[](https://www.python.org/downloads/)
[PyTorch](https://pytorch.org/)
[Tests](https://github.com/PolymathicAI/AION/actions/workflows/test.yaml)
[Open in Colab](https://colab.research.google.com/github/PolymathicAI/AION/blob/main/notebooks/Tutorial.ipynb)
+[Hugging Face](https://huggingface.co/polymathic-ai/aion-base)
**Polymathic's Large Omnimodal Model for Astronomy**
-[🚀 Quick Start](#-quick-start) • [🔬 Scientific Overview](#-scientific-overview) • [📦 Advanced Installation](#-advanced-installation)
+[🚀 Quick Start](#-quick-start) • [🎓 Tutorials](#-tutorials) • [🔬 Scientific Overview](#-scientific-overview) • [📦 Advanced Installation](#-advanced-installation)
@@ -24,21 +24,6 @@
AION-1 is a cutting-edge large omnimodal model specifically designed for astronomical surveys. It seamlessly integrates multiple data modalities and enables simple adaptation to a wide range of astronomical tasks.
-## Alpha Testing
-
-AION-1 model weights are hosted on Huggingface behind gates during the alpha testers phase. First, ensure that you have access to the Hugginface model weights. If you don't have access, you can request it directly on the [hugginface repo here](https://huggingface.co/polymathic-ai/aion-base).
-
-Once you have access, you will need to set up a huggingface token locally. This can be done by first installing hugginface_hub:
-```bash
-pip install huggingface_hub
-```
-
-and then logging in via
-```bash
-huggingface-cli login --token YOUR_HF_TOKEN
-```
-All of the ensuing steps should work out of the box after this point.
-
## 🚀 Quick Start
Assuming you have PyTorch installed, you can install AION with a single command, sketched below.
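
A minimal sketch of the install step, assuming the package is published on PyPI under the name `aion` (inferred from the repository name, not confirmed in this diff):

```bash
# Install AION from PyPI (assumed distribution name: aion)
pip install aion
```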
@@ -82,7 +67,6 @@ preds = model(
Start with our interactive tutorial:
- **[Open in Google Colab](https://colab.research.google.com/github/PolymathicAI/AION/blob/main/notebooks/Tutorial.ipynb)** - Learn AION basics interactively, no local setup required!
-For detailed guides, see the [online documentation](https://polymathic-ai.github.io/AION/).
## 🔬 Scientific Overview
@@ -114,11 +98,11 @@ AION-1’s tokenizers cover **39 distinct data types**, grouped by survey and da
### 📈 Model Variants
-| **Variant** | **Encoder Blocks** | **Decoder Blocks** | **Model Dim** | **Heads** | **Total Params** |
-|------------:|-------------------:|-------------------:|--------------:|----------:|-----------------:|
-| **Base** | 12 | 12 | 768 | 12 | 300 M |
-| **Large** | 24 | 24 | 1024 | 16 | 800 M |
-| **XLarge** | 24 | 24 | 2048 | 32 | 3 B |
+| **Variant** | **Encoder Blocks** | **Decoder Blocks** | **Model Dim** | **Heads** | **Total Params** | **Model** |
+|------------:|-------------------:|-------------------:|--------------:|----------:|-----------------:|-----------|
+| **Base** | 12 | 12 | 768 | 12 | 300 M | [aion-base](https://huggingface.co/polymathic-ai/aion-base) |
+| **Large**   | 24 | 24 | 1024 | 16 | 800 M | coming soon |
+| **XLarge**  | 24 | 24 | 2048 | 32 | 3 B | coming soon |
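+
+Using the Hub links in the table, a released variant can be loaded directly. A minimal sketch, assuming the package exposes an `AION` class with a Hugging Face-style `from_pretrained` loader (the class and method names are assumptions, not confirmed in this diff):
+
+```python
+# Hypothetical usage; adjust names to the actual aion API.
+from aion import AION
+
+# Downloads weights from https://huggingface.co/polymathic-ai/aion-base
+# and instantiates the Base variant.
+model = AION.from_pretrained("polymathic-ai/aion-base")
+```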
> **Pretraining**
> – Global batch size: 8 192
@@ -241,7 +225,8 @@ This project is licensed under the MIT License - see the [LICENSE](LICENSE) file
## 🌟 Acknowledgments
-AION is developed by [Polymathic AI](https://polymathic-ai.org/), advancing the frontier of AI for scientific applications.
+AION is developed by [Polymathic AI](https://polymathic-ai.org/), advancing the frontier of AI for scientific applications. We gratefully acknowledge the support of the Simons Foundation and of Schmidt Sciences. This project was granted access to compute and storage resources by GENCI at IDRIS under grant 2024-GC011015468, on the H100 partition of the Jean Zay supercomputer.
+Additionally, some of the computations in this work were run at facilities supported by the Scientific Computing Core at the Flatiron Institute, a division of the Simons Foundation.
## 📬 Contact