this post was submitted on 05 Feb 2025
512 points (96.7% liked)

Greentext

5045 readers
1399 users here now

This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

founded 1 year ago
 
top 50 comments
[–] [email protected] 8 points 1 day ago

They're clever. Cheaters, uh, find a way.

[–] [email protected] 18 points 2 days ago* (last edited 2 days ago)

Brainless GPT coding is becoming the new norm at uni.

Even when I get the code via ChatGPT, I try to understand what it does. How are you going to maintain those hundreds of lines if you don't know how they work?

Not to mention, you won't be able to cheat your way through a recruitment interview.

[–] [email protected] 9 points 1 day ago

Anon volunteers for Neuralink

[–] [email protected] 82 points 2 days ago (3 children)

The bullshit is that anon wouldn't be fsked at all.

If anon actually used ChatGPT to generate some code, memorized it, understood it well enough to explain it to a professor, and got a 90%, congratulations, that's called "studying".

[–] [email protected] 26 points 2 days ago

Professors hate this one weird trick called "studying"

[–] [email protected] 6 points 2 days ago (3 children)

I don't think that's true. That's like saying that watching hours of guitar YouTube is enough to learn to play. You need to practice too, and learn from mistakes.

[–] [email protected] 8 points 2 days ago (2 children)

I don't think that's quite accurate.

The "understand it well enough to explain it to a professor" clause is carrying a lot of weight here - if that part is fulfilled, then yeah, you're actually learning something.

Unless of course, all of the professors are awful at their jobs too. Most of mine were pretty good at asking very pointed questions to figure out what you actually know, and could easily unmask a bullshit artist with a short conversation.

[–] [email protected] 1 points 1 day ago* (last edited 6 hours ago)

You don't need physical skills to program; there is nothing that has to be honed into muscle memory by repetition. If you know how to type and what to type, you're ready to type. If you know which strings to pluck, you still need to train your fingers to do it, and that's a different skill.

[–] [email protected] 3 points 1 day ago* (last edited 1 day ago)

I didn't say you'd learn nothing, but the second task was not just to explain the code (which you'd have in front of you to look at), but to actually write new code, for a new problem, from scratch.

[–] [email protected] 2 points 1 day ago

No, he's right. Before ChatGPT there was Stack Overflow. A lot of learning to code is learning to search for solutions on the Internet. The crucial thing is to learn why a given solution works, though. Memorizing code the way you'd memorize a language is impossible. You'll obviously memorize some common stuff, but things change really fast in the programming world.

[–] [email protected] 3 points 2 days ago (1 children)

It's more like playing a song on Guitar Hero enough times to be able to pick up a guitar and convince a guitarist that you know the song.

Code from ChatGPT (and other LLMs) doesn't usually work on the first try. You need to go fix and add code just to get it to compile. If you actually want it to do whatever your professor is asking you for, you need to understand the code well enough to edit it.

It's easy to try for yourself. You can go find some simple programming challenges online and see if you can get ChatGPT to solve a bunch of them for you without having to dive in and learn the code.
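
To make that concrete, here's a made-up Java sketch (the class name, method, and exercise are invented for illustration, not taken from the thread): the final working version of the kind of small program an LLM gets almost right, with comments marking the sort of fixes (a forgotten import, an off-by-one loop bound) you only catch if you understand the code.

    import java.util.ArrayList; // the kind of import a generated draft often leaves out
    import java.util.List;

    public class WordStats {
        // Hypothetical exercise: collect the words longer than minLength characters.
        static List<String> longWords(String sentence, int minLength) {
            List<String> result = new ArrayList<>();
            String[] words = sentence.trim().split("\\s+");
            for (int i = 0; i < words.length; i++) { // fixed bound: a draft using <= crashes here
                if (words[i].length() > minLength) {
                    result.add(words[i]);
                }
            }
            return result;
        }

        public static void main(String[] args) {
            // prints [quick, brown, jumps]
            System.out.println(longWords("the quick brown fox jumps", 4));
        }
    }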

[–] [email protected] 3 points 2 days ago

I mean, I feel like, depending on what kind of problems they started off with, ChatGPT probably could just solve simple first-year programming problems. But yeah, as you get to higher-level classes it definitely won't fully solve the stuff for you, and you'd have to actually go in and fix it.

[–] [email protected] 15 points 2 days ago (1 children)

Yeah, if you memorized the code and its functionality well enough to explain it in a way that successfully bullshits someone who can sight-read it... you know how that code works. You might need a linter, but you know how that code works and can probably at least fumble your way through a shitty v0.5 of it.

[–] [email protected] 108 points 3 days ago (32 children)

Yeah, fake. No way you can get 90%+ using ChatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.

[–] [email protected] 7 points 2 days ago
  1. Ask ChatGPT for a solution.
  2. Try to run the solution. It doesn't work.
  3. Post the solution online as something you wrote all on your own, and ask people what's wrong with it.
  4. Copy-paste the fixed-by-actual-human solution from the replies.
[–] [email protected] 12 points 2 days ago

If we're talking about freshman CS 101, where every assignment is the same year-over-year and it's all machine graded, yes, 90% is definitely possible because an LLM can essentially act as a database of all problems and all solutions. A grad student TA can probably see through his "explanations", but they're probably tired from their endless stack of work, so why bother?
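
To make the CS 101 point concrete, here's a hypothetical example of that kind of machine-graded, repeated-every-year assignment: a problem that exists online, with solutions, in thousands of variations an LLM has effectively memorized.

    // Classic first-year exercise: FizzBuzz, typically graded by diffing stdout
    // against an expected output file. Reproducing it proves very little.
    public class FizzBuzz {
        public static void main(String[] args) {
            for (int i = 1; i <= 100; i++) {
                if (i % 15 == 0) {
                    System.out.println("FizzBuzz");
                } else if (i % 3 == 0) {
                    System.out.println("Fizz");
                } else if (i % 5 == 0) {
                    System.out.println("Buzz");
                } else {
                    System.out.println(i);
                }
            }
        }
    }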

If we're talking about a 400 level CS class, this kid's screwed and even someone who's mastered the fundamentals will struggle through advanced algorithms and reconciling math ideas with hands-on-keyboard software.

[–] [email protected] 2 points 1 day ago

Two words: partial credit.

[–] [email protected] 211 points 3 days ago* (last edited 3 days ago) (16 children)

https://nmn.gl/blog/ai-illiterate-programmers

Relevant quote

Every time we let AI solve a problem we could’ve solved ourselves, we’re trading long-term understanding for short-term productivity. We’re optimizing for today’s commit at the cost of tomorrow’s ability.

[–] [email protected] 33 points 3 days ago* (last edited 3 days ago) (7 children)

I like the sentiment of the article; however, this quote really rubs me the wrong way:

I’m not suggesting we abandon AI tools—that ship has sailed.

Why would that ship have sailed? No one is forcing you to use an LLM. If, as the article supposes, using an LLM is detrimental, and it's possible to start having days where you don't use an LLM, then what's stopping you from increasing the frequency of those days until you're not using an LLM at all?

I personally don't interact with any LLMs, neither at work nor at home, and I don't have any issue getting work done. Yeah, there was a decently long ramp-up period — maybe about 6 months — when I started on my current project at work where it was more learning than doing; but now I feel like I know the codebase well enough to approach any problem I come up against. I've even debugged USB driver stuff, and, while it took a lot of research and reading USB specs, I was able to figure it out without any input from an LLM.

Maybe it's just because I've never bought into the hype; I just don't see how people have such high respect for LLMs. I'm of the opinion that using an LLM has potential only as a true last resort — and even then it will likely not be useful.

[–] [email protected] 41 points 3 days ago

Hey that sounds exactly like what the last company I worked at did for every single project 🙃

[–] [email protected] 103 points 3 days ago (1 children)
[–] [email protected] 66 points 3 days ago (1 children)

Probably promoted to middle management instead

[–] [email protected] 23 points 3 days ago

He might be overqualified

[–] [email protected] 54 points 3 days ago (9 children)

This person is LARPing as a CS major on 4chan

It's not possible to write functional code without understanding it, even with ChatGPT's help.

[–] [email protected] 113 points 3 days ago (3 children)

If it's the first course where they use Java, then one could easily learn it in 21 hours, with time for a full night's sleep. Unless there's no code completion and you have to write imports by hand. Then, you're fucked.
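
As a hedged illustration of the imports point (the exercise itself is invented, not from the thread): even a trivial read-a-file-and-sort program drags in a handful of java.* imports that most people let the IDE fill in rather than write from memory.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Collections;
    import java.util.List;

    public class SortLines {
        public static void main(String[] args) throws IOException {
            // Read every line of the file named on the command line, sort, and print.
            List<String> lines = Files.readAllLines(Path.of(args[0]), StandardCharsets.UTF_8);
            Collections.sort(lines);
            lines.forEach(System.out::println);
        }
    }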

[–] [email protected] 132 points 3 days ago (4 children)

If there's no code completion, I can tell you that even people who've been coding as a job for years aren't going to write it correctly from memory. Because we're not being paid to memorize this shit, we're being paid to solve problems optimally.

[–] [email protected] 36 points 3 days ago

Also get paid extra to not use Java

[–] [email protected] 34 points 3 days ago (3 children)

My first programming course (in Java) had a pen and paper exam. Minus points if you missed a bracket. :/
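
For anyone who hasn't sat one of these, a small made-up sketch of what you'd reproduce by hand, where dropping any single closing brace cost points (and would fail to compile on a real machine anyway):

    public class Counter {
        private int value;

        public void increment() {
            value++;
        } // forget this brace on paper and the marks are gone

        public int getValue() {
            return value;
        }
    }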

[–] [email protected] 35 points 3 days ago* (last edited 3 days ago) (6 children)

isn't it kinda dumb to have coding exams that aren't open book? if you don't understand the material, on a well-designed test you'll run out of time even with access to the entire internet

when in the hell would you ever be coding IRL without access to language documentation and the internet? isn't the point of a class to prepare you for actual coding you'll be doing in the future?

disclaimer: did not major in CS, but did have a lot of open book tests. I failed when I should have failed because I didn't study enough, and passed when I should have passed, because familiarity with the material is what lets you find your references fast enough to complete the test.

[–] [email protected] 75 points 3 days ago (22 children)

Why would you sign up for college to willfully learn nothing?

[–] [email protected] 1 points 1 day ago

If you go through years of education, learn nothing, and all you get is a piece of paper, then you've just wasted thousands of hours and tens of thousands of dollars on a worthless document. You can go down to FedEx and print yourself a diploma on nice paper for a couple of bucks.

If you don't actually learn anything at college, you're quite literally robbing yourself.

[–] [email protected] 43 points 3 days ago* (last edited 3 days ago) (4 children)

My Java classes at uni:

Here's a piece of code that does nothing. Make it do nothing, but in compliance with this design pattern.

When I say it did nothing, I mean it had literally empty function bodies.
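
A hypothetical reconstruction of that kind of assignment (all names invented): the original do-nothing class, and the same nothing rearranged to satisfy a Strategy pattern.

    // Before: does nothing.
    class ReportGenerator {
        void generate() {
            // intentionally empty
        }
    }

    // After: still does nothing, but now it's a Strategy.
    interface ReportStrategy {
        void generate();
    }

    class EmptyReportStrategy implements ReportStrategy {
        @Override
        public void generate() {
            // intentionally empty, per the assignment
        }
    }

    class StrategyReportGenerator {
        private final ReportStrategy strategy;

        StrategyReportGenerator(ReportStrategy strategy) {
            this.strategy = strategy;
        }

        void generate() {
            strategy.generate(); // delegates to... nothing
        }
    }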

[–] [email protected] 91 points 3 days ago (1 children)

generate code, memorize how it works, explain it to profs like I know my shit.

ChatGPT was just his magic feather all along.

[–] [email protected] 27 points 3 days ago* (last edited 3 days ago) (7 children)

pay for school

do anything to avoid actually learning

Why tho?

[–] [email protected] 50 points 3 days ago (15 children)

I don't think you can memorize how code works well enough to explain it and not learn coding.
