is it possible to still rawdog programming?
Posted by UngodlyKirby@reddit | learnprogramming | View on Reddit | 80 comments
Hi, I (17F) am a first-year computer science student, and I’m currently learning C as my first language in an academic setting.
Other languages I have played around with are Python, CSS, HTML and JavaScript. I wouldn’t say I have a strong foundation in any of them, but I’ve dabbled a bit. I’m pointing out my coding/programming background to show I barely have any knowledge: when I was learning those languages I barely built any projects, except when I was learning HTML and CSS, where I made very beginner-level web pages, task bars, etc.
I really don’t want to get dependent on AI, because on different subreddits I see people say they hire SWEs or software developers who aren’t able to code at all, and I don’t want that to be me. Even though AI has been around for a while now, I want to act like it’s still the 2010s-2020, when people were learning how to code without tools like that. Another reason is that my degree is tailored more to practical and applied programming than to theory and mathematics: toward the second semester of first year and in second year I’ll be doing less mathematics and computer science theory and more Data Structures and Algorithms, Computer Architecture, Object Oriented Programming, and Databases. I don’t want to GPT my way through this degree. I want to know why and how things work, and I want to be able to actually think critically and problem solve. I’m not saying people who use AI cannot do this; I’ve heard several senior developers use these tools in their day-to-day work. But as a beginner with a foundation that’s not so sturdy, if I rely on AI as a tool or teacher I might get too dependent on it. Maybe that’s just a skill issue on my end 😅.
I noticed C is a bit different from these languages because C is more of a backend language and is compiled. I wouldn’t say it’s a hard language to learn, but it’s definitely tricky for me, and I don’t really want to use AI to learn it. Apart from W3Schools and YouTube videos, which other resources (books, blogs, websites) can I use to learn this language?
PoMoAnachro@reddit
You've absolutely got the right approach here.
I tell my students they may well use AI when they're working in industry as developers because it can be good for automating away easy or trivial problems.
But the problem you're trying to solve in school is "how do I grow my brain and develop problem solving skills and mental endurance?" and getting AI to solve things actively works against that development. You want to be conditioning your brain to long periods of mental focus on complex things - you're working out your brain just like a powerlifter goes to the gym to build muscle. Getting someone else to lift the weights for you - or getting AI to solve problems for you - actively works against what you're trying to do.
Anyway, you absolutely can learn without AI and you'll learn faster and better without it (AI is good for tricking people into thinking they've learned something when they really haven't). Learning from books (while constantly trying things out yourself and seeing how they work) remains one of the best ways (second really only to being guided through by a quality instructor).
callmejenkins@reddit
Tell your students they may well use AI in the future. I've interviewed with 3 companies so far that requested I use it, but you still have to know what you're doing. My interviews were graded on being able to explain why I prompted the AI the way I did, explain the answer in terms of n (runtime) and memory, and explain the AI's code accurately. So, while AI can be and is very useful (unless you're at FAANG or doing research, unlike the vast majority of us), you still need the core skills to use it.
Small_Dog_8699@reddit
The more you use AI, the more you degrade those core skills.
https://www.forbes.com/sites/chriswestfall/2024/12/18/the-dark-side-of-ai-tracking-the-decline-of-human-cognitive-skills/
Happiest-Soul@reddit
That was my initial thought reading your reply.
As easy as it is to degrade your ability (by letting it handle all the output), it's just as easy to use it to augment your abilities (by having it act as a mentor).
As a beginner, the latter has been amazing for me.
Eventually, I'll probably be doing a mix of both given market trends.
callmejenkins@reddit
Sure, and calculators stopped everyone from doing math.
bravopapa99@reddit
They did.
https://www.calculatorlibrary.com/blog/calculators-and-mental-math-skills
"""
Recent studies have found that there may be a correlation between calculator use and decreased mental math abilities. One study by researchers at the University of Cambridge found that students who regularly used calculators on math tests had lower scores than those who didn't. Additionally, a study published in the Journal of Educational Psychology found that students who were allowed to use calculators on math tests had a more challenging time solving math problems without using a calculator.
"""
Plenty more where that came from.
Consistent_Cap_52@reddit
Some truths, some myths. Elementary students learning arithmetic should not be using a calculator. Doing advanced maths without a calculator is ridiculous.
bravopapa99@reddit
Agreed. I used a calculator a lot for A-level maths.
PoMoAnachro@reddit
This is the big difference between using something as a tool as a professional and using it as a student building your mind.
Obviously, calculators are super useful for people adding numbers all day.
But there's a reason we teach children the number line and how to count and basic arithmetic before letting them use calculators to solve problems.
If you just handed children calculators in kindergarten and said "punch in these symbols and then write down whatever symbols the machine gives you back" they'd never learn to count, never mind learn arithmetic. And then suddenly you have children who can never grow up into the types of minds that make calculators.
"Sure, and calculators stopped everyone from doing math" is right up there with "Sure, and cars made everyone stop walking places and contributed to an obesity epidemic" or "Sure, and social media made everyone stop leaving their house to socialize and led to declining social skills".
Small_Dog_8699@reddit
They made you slower.
Ever bid at an auction? The guy with fast figures in his head will beat you every time on any time dependent deal.
The calculator argument is fallacious. The range of skills is nowhere near comparable. But FWIW, in engineering school, they pushed head math hard because when your oil well is shooting oil into the sky, figuring how to mix the fluid weight to tamp it down isn't the kind of thing you want to go back to the office to figure out.
callmejenkins@reddit
It's absolutely not fallacious, it's just simplistic. If you're gonna post reactionary fear-mongering articles so Forbes can get clicks, you're going to get a dismissive response from me. It's just as bad as the propaganda articles.
imkindathere@reddit
AI is super useful in research as well
alcatraz1286@reddit
do you prohibit google too or is it somehow acceptable
Key-Seaworthiness517@reddit
You realize you can google lessons and not just raw code, right?
fhigurethisout@reddit
Hmm. On the flipside, I really thought I was too stupid for certain concepts in programming until AI came along to explain it to me better than any professor or textbook has.
You can also ask it to give you assignments in order to practice those concepts.
I agree that going without is better overall, but I don't think we should pretend that everyone will have better outcomes entirely on their own. AI is a solid and patient teacher.
PoMoAnachro@reddit
If you don't have access to good instructors, getting AI to teach you is certainly better than nothing. It can be good for explaining and giving you things to practice and quizzing you, etc.
The main thing I'd recommend against students using it for is using it to generate code for them. That really undermines their learning. And inevitably I find many students intend to start out using AI just as a tutor but when they get stuck they ask to see the solution instead of working their way through it and rob themselves of a lot of development. The "getting frustrated and chipping away at a problem until you figure it out" is a huge part of programming and building that kind of mental endurance is really important.
KwyjiboTheGringo@reddit
Right, I've been saying this to many new developers. When the goal is to be as productive as possible, then AI can be a performance enhancer if you use it wisely and understand its limitations. If the goal is to learn, then AI is a last resort for when you just can't understand something after articles, books, and people in chats have tried to explain it to you. And there are obviously many exceptions to this, but they pretty much all require a deeper understanding of the field than a new developer will have.
guywithknife@reddit
When learning, you should absolutely not use AI. Failing and struggling through a problem and figuring it out for yourself is an important part of learning.
The only valid part of using AI while still learning is to ask it to explain concepts to you, because (unlike with a human who would eventually lose their patience) you can keep asking it to explain in a different way until you finally understand.
But you should not under any circumstances get it to write any of your code for you, and that includes Copilot/auto-completion.
The quality of AI written code is also still complete garbage. Don’t rely on it.
Pale_Height_1251@reddit
Of course, just pretend AI doesn't exist if you don't want to use it.
When we say "back end" and "front end" we generally mean for web, and C is seldom used for web backends.
"Back end" doesn't mean "technical" or "not in the browser" it really refers to the backend of a website or maybe an API server. C is rarely used for those.
Treemosher@reddit
Oh yeah, nothing has happened in the world that would make it impossible to learn programming without AI. In other words, there's nothing mandatory about using AI, unless it's directly tied to the project you're working on.
AI is one tool among many. Humans go through this type of conversation every time a new technology enters the ring.
When digital photography became a thing, are you a real photographer if you don't use film?
Look up the controversies from when computers were introduced into households. You could replace "computer" with "AI" and see very similar fears being expressed and very similar conversations in the media.
If anything, it just might be easier to stand out if you took the time to learn things properly.
SnugglyCoderGuy@reddit
Yes. AI is shit.
HasFiveVowels@reddit
Wow. The mandatory anti-AI comments are getting really lazy.
AlSweigart@reddit
You call them lazy because you can't call them inaccurate.
SnugglyCoderGuy@reddit
:thumbsup
je386@reddit
AI can be a good tool, but when someone is learning something new, it can stand in the way instead of helping.
Kaenguruu-Dev@reddit
Or, to quote the person you replied to:
SilkTouchm@reddit
AI isn't shit.
AlSweigart@reddit
I do.
az987654@reddit
I commend you for picking up C
johanngr@reddit
What was easiest for me was to learn how to build a computer from scratch with transistors. This allowed me to "know why and how things work, be able to actually critically think and problem solve", and I was inspired to learn that way by a friend who seemed to be able to think while many "programmers" seemed unable to (he was interested in electronics and hardware). I played through all of https://nandgame.com and that really helped, since after you play through it you have built a computer from scratch. I also later found the game Turing Complete on Steam; playing through that would make you understand all the low-level things. After that I built my own 8-bit computer in a hardware description language, which helped cement all of it. Courses in hardware description languages, electronics, Assembly (and C/Assembly) and "embedded", as they call it, are probably good for really being able to reason about things. I think many "programmers" underestimate how helpful it is to actually know what the computer is and how the computer works.
pat_trick@reddit
This is more electrical engineering than computer science, but the two do have very heavy crossover. You can learn a lot by programming at the assembly level.
johanngr@reddit
I simply explained what worked best for me to learn "programming". 99% of what I describe is logic-gate circuits and normal computer engineering, and the logical-operations part of that is normal computer science. Learning electronics is good too. Noticing that, opposite to the electron flow, you have a flow of subatomic medium particles from a pressure difference in the subatomic medium (from oxidation and reduction of atoms, at root probably increasing and decreasing the volume they occupy), and that you have a similar flow perpendicular to the "magnetic field" of magnets, which is why the magnet aligns in the same direction ("right hand rule"), as was understood by the 1800s in the book Physical Lines of Force, is meaningful, and understanding electricity is meaningful in general, but in my recommendation mostly for understanding the transistor. It is also useful for understanding how nonsensical an understanding and belief system the average "scientific" person has. Peace
pat_trick@reddit
Easy dude, I was agreeing with you.
johanngr@reddit
It certainly used to lean heavily toward electrical engineering when it all had to be done manually, but today there are great simulators. In my own lived experience, I've been able to get a good grasp on how CPU and RAM work without touching much electronics. I would have preferred to work manually with real parts, but I could work at 1000x speed by using simulations or a hardware description language. So, in my own lived experience, I do not agree that what I did falls under the "electrical engineering" umbrella, though I can understand that 10-20 years ago it would have had to (I also like electrical engineering, I just disagree that my recommendation was mostly about it; anyone also interested in going yet another level down will dive into that automatically).
johanngr@reddit
I just do not agree that basic computer engineering has to be approached as somehow being "electrical engineering". It is its own thing. It can be approached without even touching electricity knowledge. I learnt how to build a computer with nandgame, then built an 8-bit computer in VHDL. I was not limited to "this has to be electrical engineering". So I simply disagree. But I also like electricity, I just do not think it is the only way to think about how a computer works; you can build a non-electrical computer too.
johanngr@reddit
My point was that hiding everything that actually helps a person "know why and how things work, be able to actually critically think and problem solve" behind "that is a different topic than programming" is why many do not find the easy way to learn "programming": others decided "oh, then there is that other whole world out there, but don't worry about that". Like in Futurama when Bender says to Fry, "oh, and there is a closet too". I think you are factually wrong that logic circuit design is "electronics" and not "computer science". I disagree with that, so I mentioned it in my response. I also mentioned that people misunderstand electricity to start with, and maybe that is why there is so much separation of ideas; maybe people can't think in a whole way, because they removed the subatomic particle medium, replaced it with "nothing", and have insane, nonsensical models. So it was relevant to explain why people separate things into little hidden compartments so much. Maybe we could stop doing that.
tmetler@reddit
I strongly agree with this. There was a period when newbies getting into the industry questioned why the colleges taught this kind of stuff when they'd never actually use it on the job.
I think that learning how computers work under the hood is vital to building up a strong intuition behind how computers work. All the best engineers I know have very strong computer intuition.
It's something I took for granted because I grew up in the 90s when computers were a lot simpler and you were more exposed to how things are implemented. I learned a lot just by messing around. Tweaking things, breaking things, fixing things. I was exposed to a lot of low level abstractions that you just don't get exposed to today.
Since that's no longer an option, I think learning how a computer works from the ground up is a very effective alternative, even if that knowledge is relatively surface level.
I'd go further than that too. To really form expert level intuition I'd say learn:
It would take a lifetime to learn it all in depth, but you don't need to go that deep, just enough to get some intuition.
To really achieve greater success and job stability in this industry you need an insatiable curiosity and hunger to learn. That's why the common advice is to only get into the industry if you're passionate. Could you learn everything you need without passion? Sure, but will you really want to?
PopPunkAndPizza@reddit
"Learning" to code with an LLM is like going to the gym with a forklift - sure, you're responsible, indirectly, for more work being done than you otherwise would be, but the hard work was the part that was going to help you.
HasFiveVowels@reddit
… unless the whole point of going to the gym in the first place is to move heavy objects.
jameyiguess@reddit
Raw.. dogging... code...?
Original_Log_9899@reddit
C Primer Plus is the best book
Consistent_Cap_52@reddit
You shouldn't use AI to do your homework, however you should get accustomed to using it properly...it's not going anywhere.
MarionberryKooky6552@reddit
Expectations will probably eventually rise to match the use of AI. I'm a 3rd-year CS student, and I know people have always cheated and not everyone is passionate. But how people do homework now... it's a nightmare. They just copy-paste GPT as much as possible. And the truth is, it's often possible to get decent grades this way. I sometimes have thoughts like, "okay, if they just use AI everywhere and I really know my stuff, maybe I will have an advantage later". But I don't know if that will pay off enough.
Didn't really answer your question, just tangential to it
pat_trick@reddit
Those students are doing themselves a huge disservice and will suffer for it in the future.
Techno-Pineapple@reddit
If you're a first year computer science student, you don't need some blog to learn. Just engage with the provided learning materials. It will teach you far better and more important things than a blog, and it will help you get better grades too.
Lecturers almost always provide further learning links, texts, hints and ideas if you want to go the extra mile and self learn rather than just get a good grade.
Moloch_17@reddit
You know what? The kids are alright
tmetler@reddit
I've been in the industry for a long time at this point. I will say, there is one attribute for learning programming that eclipses all others. Curiosity.
Don't take "I don't know" for an answer. Getting to the next level means going down the rabbit hole one level at a time until you feel completely comfortable.
Feed your curiosity and you will get where you need to be.
I wouldn't eschew AI. It is an amazing learning tool. Ask it to explain things for you, then cross reference its answers. Use it as a launch pad for your learning, but don't rely on it as a source of truth. Ask it how to do something, then ask it for other ways to do it, then ask it how it works, then double check it by checking primary sources and by implementing the knowledge yourself.
If you outsource your thinking to AI you will never improve, but if you use it to accelerate your thinking you can learn faster than ever before.
This job is 90% learning, so be prepared to learn a huge amount and invest in learning how to learn. If you are not interested in learning as the core part of the job then this is not the industry for you, so ask yourself, do you have an insatiable curiosity to learn more?
abel_maireg@reddit
Literally a caption to my thought
Multidream@reddit
My university gave me a book, but to be honest I didn’t much care for it. What’s more important is to be exposed to a development environment where C is forced upon you.
My uni also had a course on game development for the Nintendo Game Boy, which is exclusively C and its attached libraries. People who took that course became truly comfortable with the language.
NotMyThrowaway6991@reddit
It's hard to google things without having AI slapped in your face. But you could just scroll past it
Ok_Court_1503@reddit
Others are giving great advice. What I will say is: AI does not teach, it is like a shitty friend to copy off of. AI can be a decent tool once you're competent. Not using it early on will absolutely make you stronger.
silajim@reddit
You should, especially when learning. You need to know what you would do before asking AI to do it, so you can spot wrong output, bad logic, erroneous memory management, or a better way of doing things. For me, the way to treat AI is like a junior: use what it gives you as a blueprint and improve on it.
MediumAd1205@reddit
I’m currently in a C# class in college and use AI. I even told my advisor that I was, and asked her to please drop me from the class so I could retake it on its own; unfortunately she will not, so AI it is. A terrible professor: the first assignment was a WinForms application with no PowerPoints, only a single link to the Microsoft API docs on C#. I hate that I’m using AI, but I just wanna be done with the class. Don’t let it happen to you; stay away from AI as long as you can.
mayorofdumb@reddit
Programming is about learning how to create... I rawdog it every day as a person, and I think this helped me: thinking about how you would best organize something, and then realizing computers don't think that way. There are certain tricks used to move quicker and faster, but you need to know what a computer likes to do at its core and how we've built from the simple concept of 1/0 up to all of this.
Then AI will be the next step when you have problems that aren't 1/0. We haven't figured out the best way to guess essentially...
Most of early programming was doing more with less. C allows you to play with a computer's memory and manipulate it for specific circumstances, but the truth is that libraries and open source expanded everyone's capabilities up to Python, where you can enter simple commands to do very complex things.
The complex code has been hashed out by the old guard to instill best practices, until you need something special for just your task.
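To make the "play with a computer's memory" point concrete, here is a small illustrative sketch (mine, not the commenter's): C will happily let you reinterpret a value's storage as raw bytes, the kind of low-level access Python keeps hidden.

#include <stdio.h>

int main(void) {
    int value = 258;  /* 0x00000102 in a typical 32-bit int */

    /* View the same storage as raw, untyped bytes. */
    unsigned char *bytes = (unsigned char *)&value;

    /* The order the bytes print in reveals the machine's endianness. */
    for (size_t i = 0; i < sizeof value; i++) {
        printf("byte %zu: 0x%02x\n", i, (unsigned)bytes[i]);
    }
    return 0;
}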
Zalenka@reddit
Heck yeah! For any language, the core syntax and patterns usually don't involve any libraries. There's a lot to the standard libraries of many languages (C is a bit different, though: there isn't a rich standard API like Java's or Rust's, so if you want more than the basic types, you have to write them yourself).
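As a hedged illustration of that last point (this sketch is not from the comment): where Python's list or Java's ArrayList comes ready-made, in C you typically roll the equivalent yourself, along these lines:

#include <stdlib.h>

/* A minimal hand-rolled growable array of ints - the kind of container
   C's standard library does not provide. */
typedef struct {
    int *data;
    size_t len;
    size_t cap;
} IntVec;

/* Append a value, doubling the capacity whenever the array is full.
   Returns 0 on success, -1 if the allocation fails. */
int intvec_push(IntVec *v, int value) {
    if (v->len == v->cap) {
        size_t new_cap = v->cap ? v->cap * 2 : 4;
        int *p = realloc(v->data, new_cap * sizeof *p);
        if (p == NULL) return -1;
        v->data = p;
        v->cap = new_cap;
    }
    v->data[v->len++] = value;
    return 0;
}

Usage starts from a zero-initialized IntVec v = {0}; followed by repeated intvec_push(&v, x) calls, with a free(v.data) when you're done.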
Books are so great to learn programming. That's how I learned in the 90s. Just go through the tutorials and learn about things.
cheezballs@reddit
AI sucks at teaching. It can help explain something in a different way, but you gotta already know enough to generate the right prompt. Just get your hands dirty and you'll learn.
jurck222@reddit
https://www.learncpp.com/ This is all you need and then build projects and practice
Small_Dog_8699@reddit
He wants to learn C, not C++.
Try this one. https://www.w3schools.com/c/index.php
YetMoreSpaceDust@reddit
The third word in the post says "17F". That's a she.
ANGR1ST@reddit
We're on the internet ... so maybe.
YetMoreSpaceDust@reddit
Or an FBI agent, true.
DapperNurd@reddit
I recommend joining discord servers if you're into that, communities are a great way to learn.
cockmongler@reddit
Spend some time learning to touch type. Then you'll find waiting for AI to output code frustrating.
ANGR1ST@reddit
Textbooks and language references are handy.
In general, AI isn't very good at solving problems or making decisions. It's really good at writing text and translating or expanding things that already exist. So having a strong basis in problem solving and algorithms is going to be far more important than learning a ton of languages. From what I've seen, the "translate this C code to XYZ language" stuff AI does is pretty good.
allium-dev@reddit
For C in particular there are tons of good physical books. I'd really really recommend using one or more books. It's the best way to learn, imo, for a ton of reasons:
So yeah, my advice is get a good C book and work all the way through it. Then, after you've done that, you'll be much better set up to use any other resources to keep learning.
lordnacho666@reddit
Definitely lay off GPT until you are more experienced.
C will definitely be more complicated for you, since none of the languages you listed are statically typed or manually memory managed. So you would need to understand those two concepts to see why the syntax is the way it is.
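A minimal sketch of what those two concepts look like in practice (my illustration, not the commenter's code): every variable has a declared type fixed at compile time, and heap memory is something you request, check, and release yourself.

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int count = 5;  /* static typing: the type is declared and fixed at compile time */

    /* Manual memory management: request memory, check the result, free it yourself. */
    int *squares = malloc(count * sizeof *squares);
    if (squares == NULL) {
        return 1;  /* allocation failed */
    }

    for (int i = 0; i < count; i++) {
        squares[i] = i * i;
    }
    printf("squares[4] = %d\n", squares[4]);

    free(squares);  /* forgetting this line is a memory leak */
    return 0;
}

Neither the declarations nor the malloc/free pair has an equivalent you would ever write in Python or JavaScript, which is a big part of why C's syntax feels alien at first.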
mxldevs@reddit
You don't need to use AI if you don't want to.
Just because everyone else is cheating with AI doesn't mean you have to.
YetMoreSpaceDust@reddit
K & R: https://www.amazon.com/Programming-Language-2nd-Brian-Kernighan/dp/0131103628
JoeyD54@reddit
33 yr old here. Undergrad in comp sci and have been a programmer professionally since 2016. 2 classes away from my Master's in real time systems. My two cents:
I find AI to be a good helper, but it shouldn't do all your work for you. Instead of saying "build this thing for me" I explain in detail what I'm doing with code snippets and ask for some specific kind of help. I ask for alternatives and why it chose them or I'd explain the problem I'm having with my code to see what solution it comes up with. I might take bits of code it generates, but very VERY rarely most to all of it. I notice that I think to myself "why the hell did it choose this design" when looking at code it generates. It may get the job done, but it's not easy to read or understand at times.
It's like having a pair programmer constantly next to you that you can talk with. I'd recommend sticking to googling things. Maybe only go to specific sites and search there to fully avoid AI.
Getting a good base understanding of programming is still super important. Kudos to you for wanting to get it.
ArtisticFox8@reddit
Why wouldn't it be possible? Courses now are largely the same courses from 2010s
Solid_Mongoose_3269@reddit
Apply for a government position and come back.
GenSwiss@reddit
I think this is a solid idea. I am an experienced developer, and I do use AI. Recently, however, I have found the allure of relying on AI in ways I don’t like. For example, I might have something I want to do and just ask AI; it will generate some code (which is 100% guaranteed to be slightly off) and then I use it as a reference as I write my own code. But I don’t like this, because I find myself not understanding as much of what I am doing.
The relevant part of your question is what I do when I notice this happening. I remember what I did before all this: read the docs and, if necessary, the source code (if it exists)!
You mentioned wanting to understand why and how things work, and there is no better place than reading the docs and code. Once you have that down, you start writing some tests to confirm or invalidate your beliefs! If things blow up, read the stack trace as best you can. If you want, you can have the AI of your choice help with strange language-specific details (for example, Java stack traces sometimes have an "L" preceding the class name; you might want to know what that's all about, and relying on AI for this is an easy ask while you stay in the weeds of your current problem).
Additionally, AI has really helped me understand broad concepts better. Sometimes I ask for a refresher when I am in the weeds, to make sure that my mental framework is correct (this forces me to comprehend what the AI outputs and apply it to my specific situation).
huuaaang@reddit
Of course it is. What I do is turn off copilot/AI autocomplete and switch the AI sidebar in Cursor to "Ask" mode. That way I am less tempted to let AI generate code. I force myself to write stuff by hand and only use AI like an advanced integrated Google.
Also, regular IDE extensions for languages are still very useful. You can get far simply by having the IDE know which functions are available to call in a given context and what their signatures are.
Ronin-s_Spirit@reddit
No matter the language or end goal, I use AI as a super-googler. I will say "How can I do [some outlandish idea]?" or "Is there a different syntax/algorithm/structure to do XYZ?" and then I end up on blog posts, documentation, and Stack Overflow (don't take AI at face value) and find something cool or fitting that I didn't know existed.
Also, freeCodeCamp (the YouTube channel of the website) is like a giant well of knowledge. I haven't paid attention to C so I can't say for sure, but you may find some great learning material there.
Medical_Amount3007@reddit
I am rawdogging code 9 hours a day for work, and when I turn off the laptop I rawdog coding 8 hours into the evening. Rawdog away.
Freecraghack_@reddit
I think if you are taking your time and want to build strong competence, then "rawdogging" is the way to go. LLMs are shortcuts that can be beneficial, but they ultimately come with side effects.
AnswerInHuman@reddit
You know how math applied in the real world is more about principles than anything since you can use a calculator to perform basic operations? Programming is kind of the same. It’s more about building a computer program that has a purpose, and that purpose is what defines whatever its value may be. The programming language is just a way to write things so that the computer understands what you want it to do.
Henbotb@reddit
https://gustedt.gitlabpages.inria.fr/modern-c/