Why isn't the whole industry focusing on online learning?

Posted by unraveleverything@reddit | LocalLLaMA | 18 comments

LLMs (currently) have no memory. You will always be able to tell LLMs from humans because LLMs are stateless. Right now you basically have a bunch of hacks like system prompts and RAG that try to make them resemble something they're not.

So what about concurrent multi-(Q)LoRA serving? Tell me why there's seemingly no research in this direction? "AGI" to me seems as simple as freezing the base weights, then training one pass over a LoRA for memory. Say your goal is to understand a codebase: just train a LoRA on one pass through that codebase. First you give it the folder/file structure, then the codebase itself. Tell me why this wouldn't work. One node could then handle multiple concurrent users by storing one small LoRA per user.
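The serving idea above can be sketched in a few lines. This is a minimal toy illustration (not the `loralib` API): the dimensions, the `make_lora` helper, and the user names are made up for the example, and a real deployment would use much larger matrices and batch the adapter matmuls. The key points it shows are that the base weight `W` is shared and frozen, each user only adds a low-rank pair `(A, B)`, and the adapter can be applied at inference time without materializing a merged weight matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 8, 2  # toy sizes; real LoRA ranks are typically 4-64

# Frozen base weight, shared by every user on the node.
W = rng.standard_normal((d_out, d_in))

def make_lora(rng, d_out, d_in, r):
    """Per-user adapter. B starts at zero, so an untrained adapter is a no-op."""
    A = rng.standard_normal((r, d_in)) * 0.01
    B = np.zeros((d_out, r))
    return A, B

# One small adapter per user instead of one full model per user.
adapters = {user: make_lora(rng, d_out, d_in, r) for user in ("alice", "bob")}

def forward(x, user, alpha=1.0):
    A, B = adapters[user]
    # y = (W + (alpha/r) * B @ A) @ x, without ever merging into W.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)

# Untrained adapters leave every user's output identical to the base model's.
assert np.allclose(forward(x, "alice"), W @ x)

# Per-user storage is r * (d_in + d_out) floats instead of d_in * d_out,
# which is what makes keeping one adapter per concurrent user cheap.
```

Whether a single unsupervised pass over a codebase teaches the adapter enough to act as "memory" is the open empirical question; the serving math itself is the easy part.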

Directory structure:
└── microsoft-lora/
    ├── README.md
    ├── LICENSE.md
    ├── SECURITY.md
    ├── setup.py
    ├── examples/
    │   ├── NLG/
    │   │   ├── README.md
...


================================================
File: README.md
================================================
# LoRA: Low-Rank Adaptation of Large Language Models

This repo contains the source code of the Python package `loralib` and several examples of how to integrate it with PyTorch models, such as those in Hugging Face.
We only support PyTorch for now.
See our paper for a detailed description of LoRA.
...


================================================
File: LICENSE.md
================================================
    MIT License

    Copyright (c) Microsoft Corporation.

    Permission is hereby granted, free of charge, to any person obtaining a copy
    of this software and associated documentation files (the "Software"), to deal
    in the Software without restriction, including without limitation the rights
    to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
    copies of the Software, and to permit persons to whom the Software is
    furnished to do so, subject to the following conditions:
...