LLM-Powered LinguaMed Research Team

2025-09-05: LLM Research Meeting

Participants: All research team members

Following the first LLM study, which provided an overview of “representation,” the second session focused on the neural network models that underpin modern language models. In particular, we examined pre-Transformer architectures, centering on the sequence-to-sequence (seq2seq) framework, to trace the structural evolution of neural approaches in NLP.

Readings
  1. An Introduction to Deep Learning in Natural Language Processing: Models, Techniques, and Tools
  2. Sequence to Sequence Learning with Neural Networks
  3. Speech and Language Processing (3rd ed.)