LLM Service
The research data of the text- and language-based humanities offer a wide range of applications for large language models (LLMs). Text+ provides access to such research data via the registry, the Federated Content Search (FCS), and the contributing data centers and their repositories.
In addition, the GWDG provides a web service for the open-source LLMs LLaMA (Meta), Mixtral, Qwen, and Codestral. In this way, the National Supercomputing and AI Centre supports the development and testing of research-related AI use cases in Text+.
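The models are primarily used through the chat interface in the browser; for programmatic experiments, an OpenAI-compatible API is a common pattern for such services. The following sketch assumes such an endpoint; the base URL, model identifier, and credential handling are placeholders, not official Text+ or GWDG documentation.

```python
# Hypothetical sketch: querying one of the hosted models via an
# OpenAI-compatible endpoint. URL, model name, and API key handling
# are assumptions, not part of the official service documentation.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-llm-service.example.org/v1",  # placeholder endpoint
    api_key=os.environ["LLM_SERVICE_API_KEY"],              # placeholder credential
)

response = client.chat.completions.create(
    model="meta-llama-3.1-8b-instruct",  # placeholder model identifier
    messages=[
        {"role": "system", "content": "You answer questions about historical German texts."},
        {"role": "user", "content": "Summarize the main themes of this letter: ..."},
    ],
)
print(response.choices[0].message.content)
```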
Application Scenarios
In the text- and language-based humanities, LLMs are used as powerful tools to support research and gain new insights. In doing so, ethical considerations and compliance with data protection regulations must be taken into account, especially when dealing with sensitive or copyrighted data. An overview of various application scenarios, both within Text+ and in the general context of language- and text-based research data, can be found in the Topics and Documentation section.
Who Can Use the LLM Service and How?
After logging in via Academic Cloud, the service is initially available to all persons directly involved in the project; this restriction exists for licensing reasons. An expansion of the user base is planned for the future. All LLMs are hosted on GWDG servers, so no data flows to external providers.
Video Tutorial
Benefits
- Free use of many models
- AI chat without server-side storage of chat history
- LLMs with access to knowledge bases, e.g. Wikipedia and Wikidata
- Retrieval-Augmented Generation (RAG) with your own documents (see the sketch after this list)
- Compliance with legal requirements and, in particular, with users' privacy interests
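For illustration, the following minimal sketch shows the general RAG pattern behind the bullet above: a question is matched against your own documents via embeddings, and the most relevant document is passed to the model as context. Endpoint, model, and embedding-model names are placeholders, and the hosted service may expose RAG differently, e.g. through document upload in the chat interface.

```python
# Minimal RAG sketch: retrieve the most relevant of your own documents
# and pass it to the model as context. All endpoint and model names
# below are assumptions, not official service identifiers.
import os
import numpy as np
from openai import OpenAI

client = OpenAI(
    base_url="https://example-llm-service.example.org/v1",  # placeholder
    api_key=os.environ["LLM_SERVICE_API_KEY"],              # placeholder
)

documents = [
    "Letter from 1848 discussing the March Revolution ...",
    "Diary entry from 1923 on hyperinflation in Berlin ...",
]

def embed(texts):
    # Embed a list of texts with a placeholder embedding model.
    result = client.embeddings.create(model="example-embedding-model", input=texts)
    return np.array([item.embedding for item in result.data])

doc_vectors = embed(documents)

question = "What does the corpus say about the 1848 revolution?"
q_vector = embed([question])[0]

# Cosine similarity between the question and each document.
scores = doc_vectors @ q_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vector)
)
best_doc = documents[int(np.argmax(scores))]

answer = client.chat.completions.create(
    model="meta-llama-3.1-8b-instruct",  # placeholder
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{best_doc}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```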
Feedback Wanted
Despite the advantages described above, this integration of LLMs is a first offering that will be supplemented and improved over time, both in terms of functionality and accessibility. Feedback on the current version is therefore explicitly welcome! Please send it to us via the contact form.
Changelog
- February 2025: further improvements to the user interface; integration into the portal navigation completed
- October 2024: initial deployment of the service, including connection to Academic Cloud and the backend infrastructure