
Running Teuken-7b with Ollama: A Step-by-Step Guide

Core Problem

Teuken-7b, a popular large language model, is not yet supported by the Ollama framework. This has been frustrating for developers and users who are eager to run Teuken-7b locally but lack guidance on how to do so.

Solution & Analysis

In the meantime, you can use llamafile to get Teuken-7b running.

# Download the llamafile executable from Mozilla's releases page
# (https://github.com/Mozilla-Ocho/llamafile/releases) and make it runnable:
chmod +x llamafile

# Run a GGUF build of Teuken-7b with it. The filename below is an
# example; substitute whichever GGUF quantization you downloaded:
./llamafile -m teuken-7b-instruct.gguf
For a short tutorial with links, see https://www.johannesholstein.de/gsCMS/index.php?id=neue-bundes-ai.
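Once llamafile is serving the model, it exposes an OpenAI-compatible API (by default on http://localhost:8080), so you can query it from any HTTP client. A minimal sketch in Python, assuming the default port and endpoint; the "model" field is a placeholder, since llamafile's server runs a single model:

```python
import json
import urllib.request

# Build an OpenAI-style chat completion request for the local llamafile server.
# Port 8080 is llamafile's default; adjust if you started it with --port.
payload = {
    "model": "teuken-7b",  # placeholder; the server serves one model
    "messages": [
        {"role": "user", "content": "Nenne drei Fakten über die EU."}
    ],
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the server is running:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI schema, existing OpenAI client libraries can also be pointed at the local server by overriding the base URL.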

Conclusion

Until native Ollama support arrives, running Teuken-7b takes a small workaround. By using llamafile and following the steps above, you can sidestep the compatibility issue and start harnessing the power of this large language model.
