• 0 Posts
  • 206 Comments
Joined 2 years ago
Cake day: June 18th, 2023

  • There is plenty wrong with generative AI as a tool if you think of it in those terms.

    I would say that if the depth of analysis is limited to “AI” or “genAI”, then its use in schools is overwhelmingly bad. If that’s the limit of our ability to frame the issue, then banning AI would appear inevitable, and any graded assignment that might encourage AI use would have to go as well.

    But if you want to break things down, you can find specific tools (e.g., calculators, grammar checkers) that could be labeled as AI, or specific uses of genAI (e.g., brainstorming) that have value. And it is this latter approach – clearly identifying positive uses – that is difficult for students, media writers, and apparently policy makers to do.


  • Yes and no. Remember that rich kids could always hire ghost writers. ChatGPT made that available to the masses, but that particular problem goes back centuries.

    What we have seen is that the curriculum is often decided by a distant committee that doesn’t actually understand life on the ground. In reality, there are easy ways for teachers to undercut the utility of ChatGPT, if they have the freedom to make changes. But that depends on teachers having control over, and the time to change, how they teach.



  • The funny thing is that if AI coding were that good, we would already see widespread adoption in open source projects. But we haven’t, because it sucks. Of course commercial software development companies are free to lie about how much they use AI, or get creative with their metrics so they can collect their KPI bonuses. So we can’t really believe anything they say. But open source development is transparent, and that we can believe.

    As always, there are so many people selling snake oil by saying the word AI without actually telling you what they mean. Quite obviously there are a great many tools one could call AI that can be, and have been, used to help do a ton of things, with many of those technologies going back decades. That’s different from using ChatGPT to write your project. Whenever you see someone write about AI without giving clear definitions, there’s a good chance they’re full of s***.



  • I’ve been in your position and in the other person’s position many times. It can be frustrating, but we need to think about the big picture. It’s possible you hadn’t considered a certain approach, and it’s probable that many future readers won’t have considered it either. So even though you might have said that you want to do something specific, it’s often helpful to provide general information about another way to tackle the same issue.

    And of course you know your own situation, so these comments appear off topic to you, and they kind of are. That’s just how it is on forums.

    The other situation that comes up a lot is that people are doing it wrong. They are misusing some piece of technology, and while their kludge might kind of work right now, it’s setting them up for bigger issues in the future. Of course no one appreciates it when you tell them they’re doing it wrong.




  • Except what you’re describing doesn’t make sense. If the new owners purchased all of those things, then in reality they purchased the company, and courts are very likely to see it that way. It looks like a company-wide sale, so it probably is one, even if someone tries to add a line saying “we aren’t liable”.

    But imagine someone could “sell everything other than the liability”. In such a case, the seller would be putting themselves on the hook to pay outstanding debts (i.e., the seller would be liable). And we know they have money – they just sold the thing. So then the seller would pay… But they know that in advance, so they would not agree to such a sale in the first place, unless they were planning to steal that money through creative accounting of some kind… But both parties know all of that that in advance, so they would both be acting fraudulently.