• 0 Posts
  • 1.04K Comments
Joined 2 years ago
Cake day: July 3rd, 2023


  • Mostly backend web development (Django most recently), though I’m moderately proficient at React and JavaScript on the frontend. Also testing and QA.

    I’ve been applying to all of full stack, backend web dev, and test/QA. No bites. Learning more languages might help, but I feel like having 0 years experience in (say) Rust won’t get a lot of traction.

    Maybe it’s imposter syndrome, but I feel like I’ve always been a middle-of-the-road engineer. Good at some things, bad at others. It feels like with all the layoffs and AI, there’s less room for broadly competent people. There’s just going to be the top tier, fighting each other and getting paid less.



  • The job market is brutal, at least for tech. Every job gets hundreds of applicants. There’s a lot of AI slop. Offered salaries are down. For some reason, management wants people back in the office (which is, among other problems, a pay cut compared to WFH).

    I’ve been unemployed since the winter. Had a handful of phone screens. Haven’t made it to a technical round yet.

    My old job laid off all but one guy and a contractor.

    Honestly, I kind of want to get out of tech, but I don’t know what I could do that doesn’t require a degree and isn’t terrible.

    Unemployment runs out soon. Not that the pittance the state gives is enough to live on. I asked what I should do when it runs out, and all I got was an awkward shrug.

    Meanwhile there are billionaires living content lives of luxury.

  • It is absolutely stupid, stupid to the tune of “you shouldn’t be a decision maker”, to think an LLM is a better tool for “getting a quick intro to an unfamiliar topic” than reading an actual introduction to that topic. For most topics, Wikipedia is right there, complete with sources. For obscure things, an LLM is just going to lie to you.

    As for “looking up facts when you have trouble remembering them”, using the lie machine is a terrible idea. It’s going to say something plausible, and you tautologically are not in a position to verify it. And, as above, you’d be better off finding a reputable source. If I type in “how do I strip whitespace in Python?” an LLM could very well say “it’s your_string.strip()”. That’s wrong. Just send me to the fucking official docs.
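    The strip() trap above is easy to demonstrate. A minimal sketch (the variable names are just illustrative) of why that plausible-sounding answer is wrong if you wanted *all* whitespace removed: `str.strip()` only touches the ends of the string.

    ```python
    s = "  hello   world  "

    # str.strip() removes only leading/trailing whitespace;
    # the internal spaces survive untouched.
    print(repr(s.strip()))           # 'hello   world'

    # Removing every whitespace character needs a different approach,
    # e.g. split on any whitespace and rejoin.
    print(repr("".join(s.split())))  # 'helloworld'
    ```

    The two results differ, which is exactly the kind of subtlety a confident one-line LLM answer glosses over and the official docs spell out.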

    There are probably edge or special cases, but for general search on the web? LLMs are worse than a search engine.