Submitted 1 year ago by cyu@sh.itjust.works to technology@lemmy.world
https://www.businessinsider.com/indeed-ceo-ai-chatgpt-could-make-college-skills-obsolete-2023-9
because of AI
Oh look, a bullshit article.
You need to learn the fundamentals of how things work, and how to apply those fundamentals, not rote specifics of a particular technology.
I didn’t know the laws of physics were getting obsolete. God dammit
Man, the medical field is going to flip when uterus 2.0 goes live
Crypto and AI have rewritten the book on the laws of physics. Now you can defy gravity with AI!
Tell that to 2/3rds of JS/Node devs 😬
Fundamentals, what are those?
In my university we learned that we should learn to learn.
It has proven to be very useful.
I got an Electronics Engineering Degree almost 30 years ago (and, funnily enough, I then made my career as a software engineer and barely used the EE stuff professionally, except when I was working in high-performance computing, but that’s a side note), and back then one of my software development teachers told us “Every 5 years half of what you know becomes worthless”.
All these years later, from my experience, he was maybe being a little optimistic.
Programming is something you can learn by yourself (by the time I went to Uni I was already coding stuff in Assembly purely for fun, and whilst EE did have programming classes they added up to maybe 5% of the whole thing, though nowadays with embedded systems it’s probably more), but the most important things Uni taught me were the foundational knowledge (maths, principles of engineering, principles of design) and how to learn. Those have served me well in keeping up with that loss of relevance of half of what I knew every 5 years, sometimes in unexpected ways: obscure details of microprocessor design I learned there turned out to be useful when designing high-performance systems, the rationale behind past designs sped up my learning of new things purely because why stuff is done the way it’s done is still the same, and even Trigonometry, decades later, turned out to be essential for what I do now, game development.
So even in a fast changing field like Software Engineering a Degree does make a huge difference, not from memorizing bleeding edge knowledge but from the foundational knowledge you get, the understanding of the thought processes behind designing things and the learning to learn.
Also a software engineer… I look back on my undergrad fondly but was it really that helpful? … nah.
I also put no stock in learning how to learn. If people want to learn something they do; if they don’t, they don’t. Nobody has to go to school to fish, play video games, or be a car guy, but all of those things have crazy high ceilings of knowledge and know-how.
If you go into an industry you’re not interested in, it doesn’t matter how well you learned to learn, you’re not going to learn anything more than required. For me, I’m constantly learning things from blogs, debates, and questions I find myself asking both for personal projects and professional projects.
Really, all a university is is a guided study of what’s believed to be the foundational material in a field, plus study of a number of things aimed at increasing awareness across the board; that’s going to be more helpful to some than others.
If you graduate and work in a bunch of Python web code … those fundamentals aren’t really that important. You’re not going to write quicksort or bubble sort, very few people do; you’re going to just call .sort().
You’re also probably not going to care about Big-O, you’re probably just going to notice organically “hey this is really bad and I can rearrange it to cache the results.” A bunch of stuff like that will probably come up that you’ll never even pay any mind to because the size of N is never large enough for it to matter in your application.
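To make that concrete, here’s a minimal Python sketch (the data and the function are invented for illustration): the sorting is a single call to the built-in, and the “cache the results” instinct is a one-line decorator rather than any formal Big-O analysis.

```python
from functools import lru_cache

# Day-to-day sorting: nobody hand-rolls quicksort, you just call the built-in.
listings = [{"price": 300_000}, {"price": 150_000}, {"price": 220_000}]
listings.sort(key=lambda listing: listing["price"])

# "Hey, this is really slow, let me cache it" - the organic fix the comment describes.
@lru_cache(maxsize=None)
def expensive_lookup(user_id: int) -> str:
    # Stand-in for a slow query or computation.
    return f"profile-{user_id}"
```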
… personally I think our education system needs to be redone from the ground up. It creates way more stress than it justifies. The focus should be on teaching people important lessons that they can actually remember into adulthood, not cramming brains with an impossible amount of very specific information under the threat of otherwise living a “subpar” life.
Older societies I think had it right with their story form lessons, songs, etc. They made the important lessons cultural pieces and latched on to techniques that actually help people remember instead of just giving them the information with a technique to remember it and then being surprised when a huge portion of the class can’t remember.
this, and also nothing is 100% new - knowledge in similar areas will always help
Mostly bullshit, because the ultimate goal of college isn’t to make you memorize basic facts you need to graduate; the ultimate goal is to teach you how and where to learn about new developments in your field, or where to look up information you don’t know or don’t remember.
Yes, but the how and where to learn are changing too, which is the problem
My answer is always in books. I don’t really know what changed.
Yeah, when I was in school, when it came to high-tech we learned stuff that was already obsolete.
This has to be the stupidest AI take yet.
Was learning to do math made “obsolete” by calculators?
One thing I found especially dumb is this:
Jobs that require driving skills, like truck and taxi drivers, as well as jobs in the sanitation and beauty industries, are least likely to be exposed to AI, the Indeed research said.
Let’s ignore the dumb shit Tesla is doing. We already see self-driving taxis on the streets. California allows self-driving trucks already, and truck drivers are worried enough to petition California to stop it.
Both of those involve AI - just not generative AI. What kind of so-called “research” has declared 2 jobs “safe” that definitely aren’t?
I mean, to some degree … yes. Day to day, I do very little math. If it’s trivial I do it in my head; if it’s more than a few digits, I just ask a calculator… because I always have one and it’s not going to forget to carry the 1 or w/e.
Long division I’ve totally forgotten.
Basic algebra, yeah I still use that.
Trig? Nah. Calc? Nah.
You’re not taking college-level math to do basic calculations. You’re taking college-level math because you need to learn how to actually, fully understand and apply mathematical concepts.
So you’re still doing math then. And using a calculator as a tool to assist you.
Just like we’ll be doing with AI.
Learning professional skills? In college? My guy, that’s not what it’s about. Especially at universities, it’s not about learning professional skills as much as it is networking and earning a piece of paper that proves you can commit to something.
E.g. my university was still teaching the 1998 version of C++ in the late 2010’s. First job I had out of school used C++ 17. Was I fucked? No, because the languages I learned were far less important than how I learned to learn them.
I think the real issue is with schooling before college, and this article seems to be looking at college as the same sort of environment as the previous 12 years of school, which it isn’t. So much of everything through high school has become about pressuring teachers to hit good grades and graduation percentages that actually teaching kids how to learn and how to collaborate with others has become a tertiary goal behind simply having them regurgitate information on tests to hit those two metrics.
I have taught myself a number of things on a wide range of subjects (from art to 3d printing to car maintenance and more. City planning and architecture are my current subjects of interest) and I’ve always said when people ask about learning all this stuff that I love to learn new things, despite the school system trying to beat it out of me. I dropped out of college despite loving my teachers and the college itself both because I didn’t like my major (the school was more like a trade school, we chose our majors before we even got to the college) and because I had never learned how to learn in the previous 12 years of school. I learned how to hold information just long enough to spit it out on the test and then forget it for the next set for the next test. Actually learning how to find information and internalize it through experience came after I left school.
Joke's on you, the stuff that my college tried to teach me was obsolete a decade before I was even born thanks to tenured professors who never updated their curriculum. Thank fuck I live in the Internet age.
Claiming modern day students face an unprecedentedly tumultuous technological environment only shows a bad grasp of history. LLMs are cool and all, but just think about the postwar period where you got the first semiconductor devices, jet travel, mass use of antibiotics, container shipping, etc etc all within a few years. Economists have argued that the pace of technological progress, if anything, has slowed over time.
I don’t think that latter statement is right, and if you’ve got some papers I’d love to read them. I’ve never heard an economist argue that. I have heard them argue that productivity improvement is declining despite technological growth, though; more that it’s decoupling from underlying technology.
Robert Gordon and Tyler Cowen are two economists who have written about the topic. Gordon’s writings have been based on a very long and careful analysis, and has influenced and been cited by people like Paul Krugman. Cowen’s stuff is aimed at a more non-academic audience. You should be able to use that as a starting point for your search.
GPT is not equipped to care about whether the things it says are true. It does not have the ability to form hypotheses and test them against the real world. All it can do is read online books and Wikipedia faster than you can, and try to predict what text another writer would have written in answer to your question.
If you want to know how to raise chickens, it can give you a summary of texts on the subject. However, it cannot convey to you an intuitive understanding of how your chickens are doing. It cannot evaluate for you whether your chicken coop is adequate to keep your local foxes or cats from attacking your hens.
Moreover, it cannot convey to you all the tacit knowledge that a person with a dozen years of experience raising chickens will have. Tacit knowledge, by definition, is not written down; and so it is not accessible to a text transformer.
And even more so, it cannot convey the wisdom or judgment that tells you when you need help.
This is such a shit and out of touch article, OP. Why bring us this crap?
In theory, you go to college to learn how to think about really hard ideas and master really hard concepts, to argue for them honestly, to learn how to critically evaluate ideas.
Trade schools and apprenticeships are where you want to go if you want to be taught a corpus of immediately useful skills.
It’s the difference between sex education and sex training.
This has arguably always been the case. A century ago, it could take years to get something published and into a book form such that it could be taught, and even then it could take an expert to interpret it to a layperson.
Today, the expert can not only share their research, they can do interviews and make tiktok videos about a topic before their research has been published. If it's valuable, 500 news outlets will write clickbait, and students can do a report on it within a week of it happening.
A decent education isn't about teaching you the specifics of some process or even necessarily the state-of-the-art, it's about teaching you how to learn and adapt. How to deal with people to get things accomplished. How to find and validate resources to learn something. Great professors at research institutions will teach you not only the state-of-the-art, but the opportunities for 10 years into the future because they know what the important questions are.
I mean, that's really only true for compsci. While scientific and technological advances will indeed be made in STEM in general, they aren't fast or significant enough to make what was learned unviable.
Not even there, though. The things I learned about in my bachelor’s and master’s didn’t suddenly get made obsolete.
I’d like to see the innovation that makes algorithm theory obsolete.
Fair. I was thinking more about changes in coding language usage, but I suppose that also depended on when you were attending university. There have been periods where things changed faster in compsci than other periods.
The basic algorithms and mathematics are still the same tho; maybe the implementations are going to be different 5 years from today, but there’s not going to be a revolution in mathematics in 5 years that makes the teaching of calculus useless.
What do you mean by ‘obsolete’?
This is the best summary I could come up with:
In an essay, Hyams shared his top concerns around AI — one of which is how technologies like OpenAI’s ChatGPT will affect the job market.
“With AI, it’s conceivable that students might now find themselves learning skills in college that are obsolete by the time they graduate,” Hyams wrote in the essay.
“The higher the likelihood that a job can be done remotely, the greater its potential exposure is to GenAI-driven change,” the researchers wrote, referring to generative artificial intelligence.
The CEO’s thoughts on AI come as labor experts and white-collar workers alike become increasingly worried that powerful tools like ChatGPT may one day replace jobs.
After all, employees across industries have been using ChatGPT to develop code, write real estate listings, and generate lesson plans.
For instance, Hyams said that Indeed’s AI technology, which recommends opportunities to its site visitors, helps people get hired “every three seconds.”
The original article contains 463 words, the summary contains 148 words. Saved 68%. I’m a bot and I’m open source!
tony@lemmy.hoyle.me.uk 1 year ago
I was taught how punch cards work and that databases used direct disk access. In 1990.
In college (1995) we learned COBOL and Assembler, and pre-object-oriented Ada (closer to early Pascal than anything I can see on the wiki today). C was the ‘new thing’ that was on the machines but that we weren’t allowed to use.
The curriculum has always been 20 years behind reality, especially in tech. Lecturers teach what they learned, not what is current. If you want to keep up you teach yourself.
glad_cat@lemmy.sdf.org 1 year ago
I learned how “object-oriented databases” work in college. After 20 years of work, I still don’t know if such a thing exists at all. I read books regularly instead.
tony@lemmy.hoyle.me.uk 1 year ago
Wiki says they existed, and may still do… never come across one. I thought mongodb might be one but apparently not.
Paradox@lemdro.id 1 year ago
I’ve used one before. Maglev is a Ruby runtime built atop GemStone/S, which is an object DB. It gives Ruby some distributed powers, like the BEAM languages (Elixir and Erlang) have.
Practically all it meant was you didn’t have to worry about serializing ruby objects to store them in your datastore, and they could be distributed across many systems. You didn’t have to use message buses and the like. It worked, but not as well as you’d hope.
Amusingly, BEAM languages have access to tools a lot like OODBMSes right out of the box: ets, dets, and mnesia loosely fit the definition of an OODB. BEAM is functional and doesn’t have objects at all, so the comparisons can be a tad strained.
Postgres also loosely satisfies the definition, with jsonb columns having first class query support.
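For a rough feel of what querying into stored objects looks like, here’s a minimal sketch. It uses Python’s standard-library sqlite3 with SQLite’s JSON functions as a stand-in (not Postgres jsonb, and the table and fields are invented), but the idea is the same: dump the object into the datastore without a hand-written serialization layer and query into it directly.

```python
import json
import sqlite3

# Assumes a SQLite build with the JSON1 functions (the default in modern Python).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE things (data TEXT)")  # Postgres would use a jsonb column here

# No explicit (de)serialization layer between your objects and the datastore.
conn.execute("INSERT INTO things VALUES (?)", (json.dumps({"name": "widget", "stock": 212}),))
conn.execute("INSERT INTO things VALUES (?)", (json.dumps({"name": "gadget", "stock": 3}),))

# Query into the stored object itself, roughly what jsonb operators give you.
rows = conn.execute(
    "SELECT json_extract(data, '$.name') FROM things WHERE json_extract(data, '$.stock') > 100"
).fetchall()
print(rows)  # [('widget',)]
```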
PlexSheep@feddit.de 1 year ago
I’m currently studying Cybersecurity and I can speak positively about that. We’re taught C and Java in the programming course (Java is still ew, but C is everywhere and will be everywhere). I know a course two friends of mine took got taught Rust (I learned it at work, it’s great).
The crypto we learn is current stuff, except no EdDSA or post-quantum stuff.
Davin@lemmy.world 1 year ago
I had to take a COBOL class in the early 2000s. And one of the two C/C++ courses was 90% talking about programming and taking quizzes about data types and what functions do, and 10% making things just beyond “hello world.” And I’m still paying the student loans.
beigeoat@110010.win 1 year ago
A course in college had an assignment which required Ada, this was 3 years ago.
CoderKat@lemm.ee 1 year ago
If it was something like a language theory class, that’s perfectly valid. Honestly, university should be teaching heavily about various language paradigms and less about specific languages. Learning languages is easy if you know a similar language already. And you will always have to do it. For my past jobs, I’ve had to learn Scala, C#, Go, and several domain-specific or niche languages. All of them were easy to learn because my university taught me the general concepts and similar languages.
The most debatable language I ever learned in university was Prolog. For so long, I questioned if I would ever have a practical usage for that, but then I actually did, because I had to use Rego for my work.
scarabic@lemmy.world 1 year ago
What everyone would LIKE to learn is the exact skill that’s going to be rare and in high demand the second after you graduate. But usually what’s rare and in high demand is also new, and there are no qualified teachers for it. Anyone who knows how to do it is making bank doing it, just like all the college grads want to do. My advice is to get out of college and then spend the next four working years learning as much as you can. You’re not going to hit the jackpot as a recent grad. You’re maybe going to get in the door as a recent grad.