Please use this identifier to cite or link to this item:
http://dx.doi.org/10.25673/119225
Title: Exploration of the Efficiency of SLM-Enabled Platforms for Everyday Tasks
Author(s): Rusinov, Volodymyr; Basenko, Nikita
Granting Institution: Hochschule Anhalt
Issue Date: 2025-04-26
Extent: 1 online resource (6 pages)
Language: English
Abstract: This study explores the potential of Small Language Models (SLMs) as an efficient and secure alternative to larger models like GPT-4 for various natural language processing (NLP) tasks. With growing concerns around data privacy and the resource-intensiveness of large models, SLMs present a promising solution for research and applications requiring fast, cost-effective, and locally deployable models. The research evaluates several SLMs across tasks such as translation, summarization, Named Entity Recognition (NER), text generation, classification, and retrieval-augmented generation (RAG), comparing their performance against larger counterparts. Models were assessed using a range of metrics specific to the intended task. Results show that smaller models perform well on complex tasks, often rivalling or even outperforming larger models like Phi-3.5. The study concludes that SLMs offer an optimal trade-off between performance and computational efficiency, particularly in environments where data security and resource constraints are critical. The findings highlight the growing viability of smaller models for a wide range of real-world applications.
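As a rough illustration of the evaluation workflow the abstract describes (running a locally hosted small model on one task and scoring its output with a task-specific metric), the sketch below uses the Hugging Face `transformers` and `evaluate` libraries with a placeholder small summarization model and toy texts. The model choice, texts, and metric setup are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch of a local SLM evaluation loop: summarize locally, score with ROUGE.
# Model name, documents, and references below are placeholders, not from the paper.
from transformers import pipeline
import evaluate

# Any small summarization checkpoint that fits in local memory would do here.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
rouge = evaluate.load("rouge")

documents = [
    "Small Language Models (SLMs) can run on local hardware, avoiding the cost "
    "and privacy concerns of cloud-hosted large models."
]
references = ["SLMs run locally, reducing cost and privacy risks."]

# Generate model outputs for each document.
predictions = [
    summarizer(doc, max_length=30, min_length=5)[0]["summary_text"]
    for doc in documents
]

# Compute the task-specific metric over all prediction/reference pairs.
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # e.g. {'rouge1': ..., 'rouge2': ..., 'rougeL': ...}
```

The same pattern (swap the pipeline task and the metric) would extend to the other tasks the study lists, such as translation with BLEU or classification with accuracy/F1.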
URI: https://opendata.uni-halle.de//handle/1981185920/121183
http://dx.doi.org/10.25673/119225
Appears in Collections: International Conference on Applied Innovations in IT (ICAIIT)
Files in This Item:
File | Description | Size | Format
---|---|---|---
2-8-ICAIIT_2025_13(1).pdf | | 902.69 kB | Adobe PDF