AI Research
Posted by ASI-Enjoyer@reddit | LocalLLaMA | 54 comments
Do we still need AI research, or is ASI just a matter of scaling? I'm 17 years old and I want to become an AI researcher. I'd like to hear your opinions and get some advice.
georgejrjrjr@reddit
Yes, more research is needed, some but far from all of which is scaling research.
Even if AI is a solved problem by the time you get to college (unlikely), your github will always attest that you worked on interesting / zeitgeist-y problems before you were legally an adult. That is something top tech employers look for -- as does the Thiel Fellowship, which is at least as prestigious (like a scholarship, but for not going to college).
There is a ton of low-hanging fruit you can work on, from home, as a 17 year old. For instance, the synthetic data pipelines and post-training workflows available in the open source domain are insufficient, yet there has been a ton of research on this problem you could iterate on and deploy. Models acting in virtual environments like Minecraft are also a thing. If you have a passion for linear algebra and pre-training, there are a bunch of tricks coming from speedrunners at GPT-2-Medium-scale training that you could make a name for yourself by testing at incrementally larger scales and longer training horizons.
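The synthetic-data suggestion above can be made concrete. Below is a heavily simplified Python sketch of a generate-filter-dedupe pipeline; the generator is a stub standing in for a local LLM call (e.g. via llama.cpp or vLLM), and every function name here is illustrative rather than taken from any real library.

```python
# Toy sketch of a synthetic-data post-training pipeline:
# generate candidates, filter by a quality heuristic, deduplicate.
from hashlib import sha256

def generate_candidates(seed_prompts):
    """Stand-in for sampling an instruction/response pair per prompt."""
    return [{"prompt": p, "response": f"Answer to: {p}"} for p in seed_prompts]

def quality_filter(example, min_len=10):
    """Toy heuristic: drop responses that are too short."""
    return len(example["response"]) >= min_len

def dedupe(examples):
    """Exact-match dedup by hashing the response text."""
    seen, out = set(), []
    for ex in examples:
        h = sha256(ex["response"].encode()).hexdigest()
        if h not in seen:
            seen.add(h)
            out.append(ex)
    return out

def build_dataset(seed_prompts):
    cands = generate_candidates(seed_prompts)
    return dedupe([ex for ex in cands if quality_filter(ex)])

data = build_dataset(["What is RLHF?", "What is RLHF?", "Hi"])
```

Real pipelines add model-based scoring, fuzzy dedup, and decontamination against eval sets, but the generate/filter/dedupe skeleton is the same.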
stuehieyr@reddit
It’s not just a matter of scaling. OpenAI has created an unstoppable-force-meets-immovable-object kind of situation, forcing AI to keep learning and learning until it converges on superintelligence.
Loya_3005@reddit
If I were you I would look into research areas related to AI Safety and alignment. I believe expertise in these areas will be crucial moving forward. Here's a youtube resource to get started mate: https://www.youtube.com/@RobertMilesAI
Bastian00100@reddit
Read the latest paper on transformers² (Titans) and answer the question for yourself.
ASI-Enjoyer@reddit (OP)
I don't think this is such a big breakthrough. New architectures come out constantly and get credited with a gigantic future, but it has never turned out to be true. Even if this one gets used, it's just memory, not something super cool.
Bastian00100@reddit
It is indeed a breakthrough, and yes, these happen often. Doesn't that answer your question? It's not just a matter of scaling models.
damhack@reddit
There are lots of new techniques for improving LLMs, but that doesn’t address two critical requirements of practical, impactful AGI: replicating the other functions of biological brains necessary for cognition, world modelling and reflexive action; and operating at low power to ensure ubiquitous application.
The main problem with LLMs is that they are trained on signals from human artifacts (observations), yet those artifacts (words, audio, video) do not contain the entirety of knowledge or cognition. They are all low-bandwidth intermediary media of communication, not the content of our actual experiential knowledge or thoughts, which are ineffable. For example, research shows that language centers in the brain remain quiet when we reason. This is partly the Chomsky argument against LLMs being intelligent.
suprjami@reddit
Forget AGI/ASI, it's a meme and won't happen with current LLM tech. Transformer architecture is a dead end.
We're currently in an AI boom. These boom-bust cycles have been going on since the 1960s. See AI winter.
Multiple people in LLM tech and market analysts expect the funding for AI to dry up between 2026 and 2029, which will then result in another long period of AI winter.
This means by the time you're 21 years old, skills in AI/ML will be in an unmarketable field with no jobs available.
We're also in a period of downturn in the tech industry, the worst it's ever been. If that trend continues through the next ~4 years (likely) then you might find yourself in a difficult spot.
Personally I would pick a field with more potential for future employment, but you choose what you want. You're very young as far as employment history goes, and you can afford to make a few slightly wrong turns.
If you do choose a career in ML research, then be sure to diversify your skills. Don't just focus on ML. Learn general related skills like software development and math and computer science. That will put you in a much better position to weather the next AI winter.
Consider watching Patrick Boyle and Asianometry on YouTube to get a good finance- and tech-history-based idea of where the industry has been, where it is, and where it's likely to head.
a_beautiful_rhind@reddit
Damn, say it ain't so. I thought I had a useful hobby for once.
SeymourBits@reddit
“Forget a career in ML and become a plumber,” such gilded wisdom could only come from Skynet, cleverly posing as a “helpful Internet person.”
ASI-Enjoyer@reddit (OP)
An AI winter is no longer possible. Previous winters were caused by AI having no use in business, and that hasn't been the case for a long time. AI is very useful even if development stops at the current moment. Moreover, there is no reason development should stop: even if the most hyped methods hit a wall, a large number of researchers are working on other approaches, and progress will continue.
_idkfa@reddit
Finally a sane response!!!
SnooPeripherals5313@reddit
A rare level headed response, thank you
bbbar@reddit
I think there is huge room for improvement in AI, because the average human brain consumes 25W of power while computers running AI eat 600W or more. There are a ton of improvements possible there, in hardware and software, and it won't happen tomorrow.
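For scale, the gap in those figures is straightforward arithmetic (a sketch; the 25W and 600W numbers are the comment's own, not measurements):

```python
# Back-of-the-envelope power gap between a human brain and a
# computer running AI inference, using the figures quoted above.
brain_watts = 25.0
gpu_watts = 600.0

# How many times more power the machine draws than the brain.
efficiency_gap = gpu_watts / brain_watts
print(f"The machine draws {efficiency_gap:.0f}x more power than a brain")  # 24x
```

And that ratio ignores that the brain is doing vision, motor control, and everything else at the same time, so the true efficiency gap is even larger.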
Horror-Librarian7944@reddit
Probably because we have millions of years of pretraining (evolution) and also millions of years of developing compression techniques
MapleMAD@reddit
ASI-Enjoyer@reddit (OP)
gwern is a writer who has nothing to do with research. You could attach any random opinion from the Internet; it would be no better.
qrios@reddit
Yes.
Also yes.
The explicit goal of the field is to obsolesce itself. Make of that what you will.
Ray_Dillinger@reddit
There are lots of discoveries to be made.
We're making some big ones now, in terms of learning how to get results.
There are a bunch more big ones in the immediate future, I think. At this point we're trying stuff almost at random, or because somebody has a random idea on an off Thursday afternoon, and a significant fraction of the time it turns out to be better. This will probably last for another five or ten years, at a guess. People are inspired by structures found in animal and human brains, and driven by the random-seeming problems they run into and made-up theories about why those problems are happening. Even if their theories are wrong, it often happens that whatever they try turns out to have a benefit - maybe not the one they imagined, or maybe not for the reason they imagined. But these are wild west days, and you can discover new things just by looking over the next hill.
But after wild west days are over, for more than a lifetime there will still be plenty of work for researchers to do. It'll be rigorous, and harder, but firm theories will emerge. Discoveries will get more subtle and the results less immediately apparent to the consumer, but probably more profound and significant as we approach the efficiency of biological brains. Along the way we're going to have a lot of profound questions to answer, like maybe agreeing about what it would mean to answer questions about consciousness.
So if it inspires you, go for it. But staying on top of it through the wild west days is going to take a lot of work.
But it's important work. We're figuring out AI, and we'd better do a good job because AI is also going to figure out us.
The_GSingh@reddit
Pursue it. YOLO.
But to answer your question fr: OpenAI thinks it has actually achieved AGI in the past, and claims it has something big in the works.
In the 1% chance it's not all hype, that means AGI will already be helping AI development starting this year. Eventually ASI will emerge, at which point you likely won't have to work. You and everyone else, actually.
Whether that means universal basic income or homelessness remains to be seen. Again, take this with a grain, nvm a ton, of salt cuz it came from Sam Hypeman's company.
Ylsid@reddit
OAI's definition of AGI is generating $100B in revenue too... Lmao
DifficultyFit1895@reddit
where is the energy for all this AGI going to come from?
IxinDow@reddit
search "reversible computing"
The_GSingh@reddit
Ask agi.
All jokes aside, nuclear power is likely the answer. Plus if you have agi/asi iterate on nuclear energy it could make that even more efficient and power itself.
The thing about agi/asi is its general. It can answer your questions like a chatbot, but it can do so much more. One of these things could include energy management.
Fluffy-Feedback-9751@reddit
Nuclear fusion innit
Ylsid@reddit
Even if ASI is a real thing, we'll need AI researchers. Perhaps even more than now
Junior_Ad315@reddit
If you are interested, pursue it fully. It will be worth it for your personal enrichment and understanding, even if we get ASI or something like it in the next few years. And if that does happen, everything will be so different that no predictions or career plans made right now will really matter, so just do what interests you.
frivolousfidget@reddit
Love this answer. Steve Jobs (Apple founder) gave a great talk about only being able to connect the dots looking back, never looking forward. And I couldn't agree more. Follow the stuff that you love (with an eye on the market ofc), and when the time comes the dots will connect when you look back.
clduab11@reddit
Just to kinda chime in with an anecdote to this advice: I’ve taken dozens of pages of notes throughout all my learning over the past 5 and a half months since starting… and there’s stuff I’m looking back on that makes a lot more sense now, plugs holes where I had blinders on, and showed me that some of the things I wanted to do were WAY too complicated for my skill set. Now I know where to focus my learning in order to realize those dreams.
You’ll be starting and stopping and starting over again and configuring and reconfiguring and while it feels tedious and irritating, just remember you can always pivot off for something to refresh your brain cells. It’s crucial practice. Soon tearing stuff down and rebuilding will be second nature.
I’m excited to see what the next 6.5 months bring to take me full circle to a year of starting genAI work and engineering, and I know that I’ll be learning for a lifetime. If you’re truly as fascinated by it all as most denizens of r/LocalLLaMA, you’re gonna have a great journey.
xchgreen@reddit
TLDR: 1) Yes we do. 2) No it's not a matter of scaling.
NB: Lol, ‘do we still need AI research?’ – I wish we didn’t. Actually, I wish we had real AI research to begin with (depends on definition). Right now, we’re kind of stuck in the ‘Chinese Room’ phase, bareeeeeeeeeeeeeely scratching the surface. (and we might even realize that was the wrong surface to scratch at some point).
Go for it.
Go for anything you like - physics, ancient literature, botany, or 'AI' (however you define it) - anything. Just make sure you're happy doing whatever you're doing.
Pleasant-PolarBear@reddit
With all the new AI tools, learning has become easier than ever. Right now I'm learning machine learning alongside my formal CS education by watching lectures on YouTube, giving those videos to NotebookLM, and having homework assignments generated for me.
TheRealMasonMac@reddit
There are people researching trees (computer science). A professor in my uni got a lot of recognition for their work on it.
alby13@reddit
we seriously need AI researchers. it is not just a matter of scaling. anyone who says "no one knows" doesn't know what they are talking about
mpasila@reddit
Transformers will not be enough for AGI/ASI or whatever. It's starting to reach its limits I'd say.
LumpyWelds@reddit
Considering the continuous deluge of papers on expanding contexts, improving transformers, reducing power consumption, new forms of quantization, etc. I'd say there is plenty to work on.
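As a concrete taste of the quantization line of work mentioned above, here is a minimal sketch of symmetric per-tensor int8 quantization in Python. Real schemes (GPTQ, AWQ, k-quants) are far more sophisticated; this is just the textbook baseline they improve on.

```python
# Minimal symmetric int8 quantization: one scale per tensor,
# weights mapped to integers in [-127, 127].
def quantize_int8(weights):
    """Map floats to int8 codes with a single per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the int codes."""
    return [x * scale for x in q]

w = [0.5, -1.27, 0.003, 1.0]
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# Reconstruction error is bounded by half a quantization step.
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
```

The research the comment alludes to is largely about doing better than this: grouping weights into blocks, keeping outliers in higher precision, and choosing scales to minimize the error that actually matters for the model's outputs.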
sk-sakul@reddit
There is no real AI, so there clearly is quite a pile of research to be done...
ASI-Enjoyer@reddit (OP)
What do you think "real AI" is? Why isn't o3 one of them? Why isn't its iterative enhancement one of them?
sassyhusky@reddit
There are multiple levels of AI and the “real AI” hasn’t been developed yet. Most people consider AGI to be “real AI”, LLMs are not it, they are missing a crucial component.
Scooter_maniac_67@reddit
AGI/ASI may arrive in 1 year, or it may be 5 years or maybe you'll be the one to invent it in 8 years. Get studying.
No_Afternoon_4260@reddit
What I think I understand is that LLMs and text are "easy" because you can find so much text out there; well-annotated video is a bit harder. So how do you get a model to understand physics? There is simulation, of course, but that doesn't get you very far on some real-world use cases. Idk, just trying to paraphrase something I heard Yann LeCun say.
ASI-Enjoyer@reddit (OP)
Yes, I'm thinking about that too; there's probably still a lot of work to do in video understanding.
cromethus@reddit
AI research is still in its infancy. A hundred years from now they'll still be researching and refining.
MixtureOfAmateurs@reddit
We're gonna need ai researchers to answer that question. Btw I'm 18 on the same path, gonna fight you for a job one day lmao
LagOps91@reddit
Hard to say. I'm sure new approaches will be needed, but the question is how long it takes until AI outperforms you at AI research. Since you haven't studied yet or worked in the field, it's quite possible that the AI will improve faster than you will be able to learn.
I don't want to discourage you at all, just pointing out possible pitfalls. AI is improving at such a rapid pace, nobody knows if it will just continue or hit a wall. It's very uncertain where we end up.
Healthy-Nebula-3603@reddit
AI already hit the wall and is climbing that wall... fast
LagOps91@reddit
yes, for now. but will that hold true for the future? that's still up in the air.
Healthy-Nebula-3603@reddit
Judging by current progress and the new papers... we don't see a wall, and we're already very close to AGI...
All signs show no walls for the time being...
Alternative-Hat-2733@reddit
guessing AI will learn faster than you from this point on. relax
uwilllovethis@reddit
Investments in AI will increase massively in the upcoming years. There is no better time to pursue a career in AI.
No_Afternoon_4260@reddit
Also know that we are just starting to tackle the memory part of AI: context is like short-term memory, and long-term memory might be implemented along the lines of the latest Google paper, called Titans. Multimodality is kind of meh (some glorified OCR and vision recognition). A lot of work also has to be done in interpretability (like, wtf is happening inside that black box).
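To make the long-term-memory idea concrete, here is a toy outer-product associative memory in Python. This is emphatically not the Titans architecture from the Google paper, just the classic linear associative-memory sketch that such work builds on: store key-value pairs as a sum of outer products, recall by multiplying a query key against the memory matrix.

```python
# Toy linear associative memory: memory is a matrix accumulated
# from key (x) value outer products; recall is key @ memory.
def store(memory, key, value):
    """Add the outer product key . value^T into the memory matrix."""
    for i, k in enumerate(key):
        for j, v in enumerate(value):
            memory[i][j] += k * v

def recall(memory, key):
    """Read out by multiplying the query key against the memory."""
    return [sum(k * memory[i][j] for i, k in enumerate(key))
            for j in range(len(memory[0]))]

# With orthogonal keys, each stored value is recalled exactly.
mem = [[0.0] * 2 for _ in range(2)]
store(mem, [1.0, 0.0], [3.0, 4.0])
store(mem, [0.0, 1.0], [5.0, 6.0])
```

With non-orthogonal keys the recalled values interfere, which is exactly why modern long-term-memory research (Titans included) looks at learned update rules rather than plain accumulation.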
Cloakk-Seraph@reddit
Fuck me this is an alarming question.
amang0112358@reddit
Scaling will require research. Day one of "ASI" is not the end of human-conducted AI research.
Healthy-Nebula-3603@reddit
If you are doing it for yourself yes go for it.
If for money... I wouldn't go this way... if they execute, it will be AGI soon and a bit later ASI, before you finish your education...
Radiant_Dog1937@reddit
No one knows if they are being honest. Just carry on as if the work will still be necessary until such a time that's proven otherwise.