Abstract
Large language models (LLMs) are becoming an integral part of
our daily work. In the field of ecology, LLMs are already being
applied to a wide range of tasks, such as extracting georeferenced data
or taxonomic entities from unstructured texts, information synthesis,
coding, and teaching (Methods Ecol Evol 2024; Npj Biodivers 2024).
Further development and increased use of LLMs in ecology, as in
science generally, are likely to intensify and accelerate the research
process and increase publication output, pressuring scientists to keep
up with the elevated pace and thereby creating a feedback loop that
promotes even greater LLM use.
However, this all comes at a cost. Although that cost is not directly
borne by end users, beyond occasional response delays, LLMs require
considerable computational power and are energy-demanding both during
their initial training phase and in subsequent operational use
(Nature 2025). Furthermore, partly externalized energy costs arise
from the intensive searching and processing of discovered sources in
Deep Research workflows. It currently remains difficult to estimate
the total energy cost of LLMs, largely because of limited transparency
from the companies that develop them.
Keywords: Large language models; Ecology
Publisher: John Wiley & Sons