CTYB8.0: broke boards are broke (alternate title: Meck Dec Day? More like Meck Meh Day)

I'll be the dummy butting into the intellectual conversation, but as the parent of a kid (hopefully) entering college next year, I'm finding the discussion interesting.

I hope my kid embraces those experiences that shape how she thinks rather than viewing the assignment as checking a box on the way to her credential that is required for getting whatever job she wants in the future. I also hope that she recognizes that if she can put forth just a little more effort than the AI zombies in her classes, she can end up with a grade that is better than "just good enough". AND YET, being adept at using the tools of AI will be a requirement for just about any career of the future, I think.

To what extent are you incorporating how best to use AI tools as a part of what you're teaching, if at all?
it's an incomplete answer, but i told my last public speaking students they could use gpt as much as they wanted for generating speech material, then contextualized that GPT generates content that's a synthesis of mountains of data (we looked at some examples of a hypothetical student using GPT to generate a speech) - but i'm way more interested in (and thus will be evaluating them on) how they personally approach each topic they're speaking about, both for the speaker and the audience - something GPT could probably generate too (esp. the latter), but they'd have to spend a shit load of time prompting it

I also shifted my rubric that year to put greater emphasis on delivery, which is unusual for me

for me it's a win because the composition portion of that class was always super lame - the textbook was already a "condensed" approach to public speaking, and I already didn't have time to work v closely on both composition and delivery, so I wasn't super torn about throwing one to the wolves
 
for yr kid, just help them to think critically about the tool

both how it can be used to help, and what they're sacrificing when they use the tool, and what its limitations are
 
To what extent are you incorporating how best to use AI tools as a part of what you're teaching, if at all?
I am not.

Generative AI, in particular, skips all the steps that my discipline finds valuable in actually learning and doing

I can see how there'd be value in fields that use, for example, predictive modeling or code-generation, or marketing research. But much of what I teach is not outcomes-based, even if higher education, as a whole, seems to be.

(Everything you said sounds thoughtful and sensible. But I do follow this stuff pretty closely and I'm yet to read or see examples of such lessons for undergraduates that I think are truly valuable for learning or professionalization. I'm sure they're out there and I'd welcome them, in theory, but I think my time and energy is best spent on other things. For now, at least)
 
because only one in ten human beings is smart or industrious enough to proofread and revise the nonsense it spits out
Seems like we should focus on teaching the human being how to edit and revise the nonsense rather than fight the generative tool.

Or...maybe we could look forward to a future when AI writes our stupid TPS reports and we humans have more time for things like writing poetry or bird watching.
 
I agree with everything you said, DiV, except for this. I read every article in the Chronicle of Higher Education about the importance of incorporating ai tools into pedagogy and go to every lecture promising a future where ai will be a basic component of every job everywhere.

I agree with you that various forms of ai will be integral to many, if not most careers. But in my opinion (and I'm certainly no expert on ai), the complexity of the particular ai tools that most careers of the (near) future will require, and the level of human training needed to deploy them with basic competency, are relatively low.

I know I'm a vocal advocate for humanities education in a time when many people simply do not care and would prefer to focus on metrics like speed and efficiency, but I see the most desirable job candidates of the future as those who can operate thoughtfully and critically without technologies that artificially augment human intelligence. They can figure out any requisite tools when they get there
This makes sense. It does remind me that I finished college around the time the World Wide Web became a thing. Maybe the growth of AI applications will allow us to focus our energies/education on those things that make us most human/ can't be replicated by 'puters. As technology has advanced, it has enabled me to spend less time on the stuff that was never really a good use of my time to begin with.

Or, what birdman just said.
 
Or...maybe we could look forward to a future when AI writes our stupid TPS reports and we humans have more time because most of us will have been laid off

FIFY, sadly. That’s how the adoption of automation has gone
 
if capital extraction is the goal the boss pockets the increase in productivity and forces you to do more with less so yeah that tracks
 
This makes sense. It does remind me that I finished college around the time the World Wide Web became a thing. Maybe the growth of AI applications will allow us to focus our energies/education on those things that make us most human/ can't be replicated by 'puters. As technology has advanced, it has enabled me to spend less time on the stuff that was never really a good use of my time to begin with.

Or, what birdman just said.
Everyone is just gonna be in sales, I think. A computer can't take me to the strip club.
 
in my view, students use generative ai because it's free, easy, and most teachers aren't good at identifying it. it saves them time and effort and, for now at least, will probably get them a grade that's just good enough.

And while I think thoughtful teaching and course-design can mitigate the problem, I don't think it's a "lecture-heavy / archaic information banking model of education" behind students' attraction to it. I agree with you that some conservative places/people still teach a "model of composition/rhetoric that emphasizes dispositio", but like I said in my post above many places are moving away from that model.

Also, to be clear, I'm not just concerned with first-year writing. This is a problem across higher education. And education in general
?!?!

 
if capital extraction is the goal the boss pockets the increase in productivity and forces you to do more with less so yeah that tracks

so far the only actual humans who have told me AI is awesome are the people selling AI as a product or service
 
Seems like we should focus on teaching the human being how to edit and revise the nonsense rather than fight the generative tool.

Or...maybe we could look forward to a future when AI writes our stupid TPS reports and we humans have more time for things like writing poetry or bird watching.
So there's something to that, of course, but the problem (again, for now) is that the information being produced is generic, biased, incomplete, and not infrequently completely fabricated.

Are we willing to give up for the sake of efficiency the actual generation of new knowledge? I'm certain that sooner rather than later generative AI will be much better at what it does and be able to draw on better and more accurate information -- to produce, like in your example, useful reports that save us time and open up space for more human stuff like poetry or birds. (Although I'm sad to remind you that people said the same thing about crop rotation, the cotton gin, industrialization, computer technology, the advent of the internet, the introduction of smartphones -- and the extent to which these innovations have improved the lives of regular people, of the workers themselves, is not totally clear to me)

I'm not fighting the technology like a luddite but simply resisting the way it's currently being used and critiqued. I understand my job, in part, to be about the ethics and implications of outsourcing human ingenuity, creativity, and emotion to a machine that will (for now) do only what we tell it to do, a process in which individuality and personality and joy are reduced to the lowest common denominator.
 
The main way I use ChatGPT nowadays is to clean up my words. For example, I'm working with a colleague to retool an annual on-campus conference. I had a bunch of ideas in my head that would have taken a long time to write out and organize into themes, etc. Instead, I spent a few minutes dictating my thoughts into Word, then copied and pasted them into ChatGPT and told it to clean up the text and rewrite it as an email to a colleague. Then I edited that and sent it.
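If anyone wants to script that cleanup step instead of pasting into the web UI, here's a rough sketch of what it might look like with the OpenAI Python SDK. To be clear, this isn't what I actually do, just an illustration: the model name, prompt wording, and function name are placeholders, and it assumes you have an OPENAI_API_KEY set in your environment.

# A minimal sketch of scripting the "clean up my dictated notes" step with the
# OpenAI Python SDK. Model, prompt, and names are placeholders, not a recipe.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def clean_up_notes(dictated_text: str) -> str:
    """Ask the model to turn rough dictated notes into a tidy email draft."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use whatever model you have access to
        messages=[
            {
                "role": "system",
                "content": (
                    "Clean up the following dictated notes and rewrite them "
                    "as a clear, friendly email to a colleague."
                ),
            },
            {"role": "user", "content": dictated_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    notes = "rough ideas for retooling our annual on-campus conference ..."
    print(clean_up_notes(notes))

Same workflow, just automated; you'd still want to edit the draft before sending it, since the output can be generic or off-base.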
 