• 2 Posts
  • 500 Comments
Joined 1 year ago
Cake day: June 10th, 2023


  • Good question! I can’t remember.

    I think I read a Microsoft blog or something like a decade ago saying they shifted from a Hyper-V based solution to Linux to improve stability, but honestly it’s been so long I wouldn’t be shocked if I actually just saw it in a reddit comment on a related article, didn’t yet have the technical know-how to fully understand it, and took it as gospel.




  • I just accepted a job with a small MSP starting early next year. During the interview I kept a close ear out for signs of the classic MSP hell that chews through techs, and it does look like I got a good one (a small shop of eight or so people). But check in with me in about 3 months and we’ll see how I’m feeling, haha.

    My longer-term plan is to use this as a stepping stone into an in-house role, then figure out my exit strategy before burnout takes me. I’m thinking I’ll aim to move into IT management, or possibly a business analytics or cloud administration type role. Technical sales probably wouldn’t be too bad either.



  • He’s not wrong that discrete GPUs in the desktop space are going away, because SoCs are inevitably the future. This isn’t because the market demanded it or some sort of conspiracy; it’s simply that we can’t get faster without chips getting smaller and closer together.

    While I agree with you on a technical level, I read it as Pat Gelsinger intending to stop development of discrete graphics cards after Battlemage, which is disappointing but not surprising. Intel’s GPUs, while incredibly impressive, face an uphill battle with desktop users, particularly gamers, who need confidence that every game they wish to run will generally work without compatibility problems.

    Ideally Intel would keep their GPU department going, because they have a fighting chance at holding a significant market share now that they’re past the hardest hurdles. But they’re in a hard spot financially, so I wouldn’t be surprised if they’re forced to divest from discrete GPUs entirely.


  • Seriously, putting a couple gigs of on-package graphics memory would completely change the game, especially if it did some intelligent caching and used system RAM for additional memory as needed.

    I want to see what happens if Intel or AMD seriously let a generation rip with on-package graphics memory for the iGPU. The only real drawback I could see is the power/thermal budget just not being sufficient, leading to wonky performance. I’ve seen that on an overly thin-and-light laptop in my personal fleet: it’s got a Ryzen 2600 if memory serves, and it’s so thermally limited that it leaves a ton of performance on the table.


  • I have to disagree. When I tried out a VR headset at a con, I spent 2 hours in Space Pirate Trainer thinking it had only been 20 minutes. This was on the $250 Meta Quest 2, while I had a heavy backpack on my back because I didn’t have anyone with me to leave my bag with. I was trying to be conscious of not hogging the headset so others could have a chance, and figured 15–20 minutes would be appropriate, but apparently I was completely in the zone!

    I can count on one hand how many times a game has warped time for me like that, so I’d say VR is a pretty dang cool experience. Once hardware costs come down (or headsets become more ubiquitous), it’ll probably be a pretty big market for gamers, much like consoles are now.