Showing posts with label faculty. Show all posts

Sunday, April 26, 2026

To teach in the time of ChatGPT is to know pain; Ars Technica, April 13, 2026

SCOTT K. JOHNSON, Ars Technica; To teach in the time of ChatGPT is to know pain

"Let me explain why students are the ones losing the most in this environment and why instructors like me feel pretty much powerless to fix the problem.

Do or do not, there is no AI

Students often carry misconceptions about coursework. They may view an instructor as an opponent standing in the way of the grade they want. And they see “getting the right answers” as the goal of education because that’s how you secure that grade.

But that’s no more true than thinking that logging a count of reps is the goal of bodybuilding. The hard work of lifting weights is the point because that yields physical results. A popular analogy is that using an LLM to write your essay is like driving a forklift into the weight room. Weights get lifted, sure, but nothing is accomplished. I’m not hoping you can answer the exam question for me—I don’t need your essay to get me out of a jam. The process of doing the work was what you needed to walk away with something.

In a recent video about how easy Sora has made it for users to generate relatively realistic but deeply problematic videos, Hank Green rubbed his eyes as he shouted in the figurative direction of OpenAI CEO Sam Altman, “The friction matters, Sam!”...

I’m not alone in feeling exasperated by this predicament. A survey of about 3,000 college faculty showed that 85 percent felt LLMs “make students less likely to develop critical thinking abilities,” and 72 percent reported challenges managing LLM use.

Predictably, the response from higher education administrators―who are busy signing contracts for institutional LLM subscriptions to show how future-first their thought leadership is―has been to tell instructors that their job is to teach students “how to use AI effectively.”...

A few months ago, I overheard some college students talking about their classes. One was complaining about an assignment they needed to do that night, and another incredulously asked why they wouldn’t just have ChatGPT do it. The first replied, “This is my major, I actually need to learn stuff in this class. I use AI for my other classes.”"

Saturday, April 25, 2026

Q&A: In the age of AI, what is a library for?; UVAToday, April 15, 2026

Alice Berry, UVAToday; Q&A: In the age of AI, what is a library for?

"Q. Where do you fall on the AI enthusiast to AI detractor spectrum?

A. A faculty member at another university asked me recently whether it was defensible to ban AI in her course. I said yes.

That probably isn’t what people expect from someone who spent the last three years building a framework for AI literacy. But it was the honest answer for now. She believed her students needed to develop a specific skill that AI use would short-circuit, and banning it was the right call for that course.

What I would ask of faculty who choose that path is to stay open, keep up with how the technology is developing, and be willing to try approaches others have tested. That is part of what the lab is for: to produce case studies that give faculty something real to work from when they are ready to revisit the question.

I’m wary of the two confident positions on AI in higher education right now: the people certain it will transform teaching, and the people certain it will destroy it. Both are getting ahead of what we actually know about what’s happening in our classrooms.

Q. What is the function of a library in this AI age?

A. A research library has always done two things: help people find information, and help them judge it. AI changes the tools, not the mission. If anything, the mission gets sharper. The library is also one of the few places in a university built to convene across disciplines, and AI literacy requires exactly that: technical knowledge, ethics, critical thinking, practical skill, and societal impact all at once. No single department owns that combination. 

A library can hold it together. That is why we are launching the AI Literacy and Action Lab here. Dean Acampora and I share the conviction that AI is an opportunity for the liberal arts, not a threat to them. The lab is built on that shared premise: AI literacy is a liberal arts problem as much as a technical one, and a university that treats it only as technical will get the answer wrong."

Wednesday, February 4, 2026

Professors Are Being Watched: ‘We’ve Never Seen This Much Surveillance’; The New York Times, February 4, 2026

The New York Times; Professors Are Being Watched: ‘We’ve Never Seen This Much Surveillance’

Scrutiny of university classrooms is being formalized, with new laws requiring professors to post syllabuses and tip lines for students to complain.

"College professors once taught free from political interference, with mostly their students and colleagues privy to their lectures and book assignments. Now, they are being watched by state officials, senior administrators and students themselves."

Wednesday, October 29, 2025

Big Tech Makes Cal State Its A.I. Training Ground; The New York Times, October 26, 2025

The New York Times; Big Tech Makes Cal State Its A.I. Training Ground

"Cal State, the largest U.S. university system with 460,000 students, recently embarked on a public-private campaign — with corporate titans including Amazon, OpenAI and Nvidia — to position the school as the nation’s “first and largest A.I.-empowered” university. One central goal is to make generative A.I. tools, which can produce humanlike texts and images, available across the school’s 22 campuses. Cal State also wants to embed chatbots in teaching and learning, and prepare students for “increasingly A.I.-driven” careers.

As part of the effort, the university is paying OpenAI $16.9 million to provide ChatGPT Edu, the company’s tool for schools, to more than half a million students and staff — which OpenAI heralded as the world’s largest rollout of ChatGPT to date. Cal State also set up an A.I. committee, whose members include representatives from a dozen large tech companies, to help identify the skills California employers need and improve students’ career opportunities."

Saturday, August 23, 2025

PittGPT debuts today as private AI source for University; University Times, August 21, 2025

MARTY LEVINE, University Times; PittGPT debuts today as private AI source for University

"Today marks the rollout of PittGPT, Pitt’s own generative AI for staff and faculty — a service that will be able to use Pitt’s sensitive, internal data in isolation from the Internet because it works only for those logging in with their Pitt ID.

“We want to be able to use AI to improve the things that we do” in our Pitt work, said Dwight Helfrich, director of the Pitt enterprise initiatives team at Pitt Digital. That means securely adding Pitt’s private information to PittGPT, including Human Resources, payroll and student data. However, he explains, in PittGPT “you would only have access to data that you would have access to in your daily role” — in your specific Pitt job.

“Security is a key part of AI,” he said. “It is much more important in AI than in other tools we provide.” Using PittGPT — as opposed to the other AI services available to Pitt employees — means that any data submitted to it “stays in our environment and it is not used to train a free AI model.”

Helfrich also emphasizes that “you should get a very similar response to PittGPT as you would get with ChatGPT,” since PittGPT had access to “the best LLM’s on the market” — the large language models used to train AI.

Faculty, staff and students already have free access to such AI services as Google Gemini and Microsoft Copilot. And “any generative AI tool provides the ability to analyze data … and to rewrite things” that are still in early or incomplete drafts, Helfrich said.

“It can help take the burden off some of the work we have to do in our lives” and help us focus on the larger tasks that, so far, humans are better at undertaking, added Pitt Digital spokesperson Brady Lutsko. “When you are working with your own information, you can tell it what to include” — it won’t add misinformation from the internet or its own programming, as AI sometimes does. “If you have a draft, it will make your good work even better.”

“The human still needs to review and evaluate that this is useful and valuable,” Helfrich said of AI’s contribution to our work. “At this point we can say that there is nothing in AI that is 100 percent reliable.”

On the other hand, he said, “they’re making dramatic enhancements at a pace we’ve never seen in technology. … I’ve been in technology 30 years and I’ve never seen anything improve as quickly as AI.” In his own work, he said, “AI can help review code and provide test cases, reducing work time by 75 percent. You just have to look at it with some caution and just (verify) things.”

“Treat it like you’re having a conversation with someone you’ve just met,” Lutsko added. “You have some skepticism — you go back and do some fact checking.”

Lutsko emphasized that the University has guidance on Acceptable Use of Generative Artificial Intelligence Tools as well as a University-Approved GenAI Tools List.

Pitt’s list of approved generative AI tools includes Microsoft 365 Copilot Chat, which is available to all students, faculty and staff (as opposed to the version of Copilot built into Microsoft 365 apps, which is an add-on available to departments through Panther Express for $30 per month, per person); Google Gemini; and Google NotebookLM, which Lutsko said “serves as a dedicated research assistant for precise analysis using user-provided documents.”

PittGPT joins that list today, Helfrich said.

Pitt also has been piloting Pitt AI Connect, a tool for researchers to integrate AI into software development (using an API, or application programming interface).

And Pitt also is already deploying the PantherAI chatbot, clickable from the bottom right of the Pitt Digital and Office of Human Resources homepages, which provides answers to common questions that may otherwise be deep within Pitt’s webpages. It will likely be offered on other Pitt websites in the future.

“Dive in and use it,” Helfrich said of PittGPT. “I see huge benefits from all of the generative AI tools we have. I’ve saved time and produced better results.”"

Thursday, July 10, 2025

An AI Ethics Roadmap Beyond Academic Integrity For Higher Education; Forbes, July 8, 2025

Dr. Aviva Legatt, Forbes; An AI Ethics Roadmap Beyond Academic Integrity For Higher Education

"Higher education institutions are rapidly embracing artificial intelligence, but often without a comprehensive strategic framework. According to the 2025 EDUCAUSE AI Landscape Study, 74% of institutions prioritized AI use for academic integrity alongside other core challenges like coursework (65%) and assessment (54%). At the same time, 68% of respondents say students use AI “somewhat more” or “a lot more” than faculty.

These data underscore a potential misalignment: Institutions recognize integrity as a top concern, but students are racing ahead with AI and faculty lack commensurate fluency. As a result, AI ethics debates are unfolding in classrooms with underprepared educators. “Faculty were expected to change their assignments overnight when generative AI hit,” said Jenny Maxwell, Head of Education at Grammarly. “We’re trying to meet institutions where they are—offering tools and guidance that support both academic integrity and student learning without adding more burden to educators.”

The necessity of integrating ethical considerations alongside AI tools in education is paramount. Employers have made it clear that ethical reasoning and responsible technology use are critical skills in today’s workforce. According to the Graduate Management Admission Council’s 2025 Corporate Recruiters Survey, these skills are increasingly vital for graduates, underscoring ethics as a competitive advantage rather than merely a supplemental skill. “Just because you think you’re an ethical person doesn’t mean you won’t inadvertently do harm if you’re working in machine learning without being trained and constantly aware of the risks,” said Liz Moran, Director of Academic Programs at SAS. “That’s why we’re launching an AI Foundations credential with a dedicated course on Responsible Innovation and Trustworthy AI. Students need the ethical reasoning to use those skills responsibly.”

Friday, September 6, 2024

AN ETHICS EXPERT’S PERSPECTIVE ON AI AND HIGHER ED; Pace University, September 3, 2024

Johnni Medina, Pace University; AN ETHICS EXPERT’S PERSPECTIVE ON AI AND HIGHER ED

"As a scholar deeply immersed in both technology and philosophy, James Brusseau, PhD, has spent years unraveling the complex ethics of artificial intelligence (AI).

“As it happens, I was a physics major in college, so I've had an abiding interest in technology, but I finally decided to study philosophy,” Brusseau explains. “And I did not see much of an intersection between the scientific and my interest in philosophy until all of a sudden artificial intelligence landed in our midst with questions that are very philosophical.”

Some of these questions are heavy, with Brusseau positing an example, “If a machine acts just like a person, does it become a person?” But AI’s implications extend far beyond the theoretical, especially when it comes to the impact on education, learning, and career outcomes. What role does AI play in higher education? Is it a tool that enhances learning, or does it risk undermining it? And how do universities prepare students for an AI-driven world?

In a conversation that spans these topics, Brusseau shares his insights on the place of AI in higher education, its benefits, its risks, and what the future holds...

I think that if AI alone is the professor, then the knowledge students get will be imperfect in the same vaguely definable way that AI art is imperfect."

Saturday, August 31, 2024

More Art School Classes Are Teaching AI This Fall Despite Ethical Concerns and Ongoing Lawsuits; Artnews, August 30, 2024

KAREN K. HO, Artnews; More Art School Classes Are Teaching AI This Fall Despite Ethical Concerns and Ongoing Lawsuits

"When undergraduate students return to the Ringling College of Art and Design this fall, one of the school’s newest offerings will be an AI certificate.

Ringling is just the latest of several top art schools to offer undergraduate students courses that focus on or integrate artificial intelligence tools and techniques.

ARTnews spoke to experts and faculty at Ringling, Rhode Island School of Design (RISD), Carnegie Mellon University (CMU), and Florida State University about how they construct curriculum; how they teach AI in consideration of its limitations and concerns about ethics and legal issues; as well as why they think it’s important for artists to learn."

Thursday, August 17, 2023

Local universities prepared to teach ethics of using generative AI; Rochester Business Journal, August 15, 2023

Caurie Putnam, Rochester Business Journal; Local universities prepared to teach ethics of using generative AI

"How are local schools handling these platforms that have the potential to produce human-like AI-generated content like essays based on the input of the user? You may be surprised."

Tuesday, August 8, 2023

How book-banning campaigns have changed the lives and education of librarians – they now need to learn how to plan for safety and legally protect themselves; The Conversation, July 20, 2023

Baker Endowed Chair and Professor of Library and Information Science, University of South Carolina, The Conversation; How book-banning campaigns have changed the lives and education of librarians – they now need to learn how to plan for safety and legally protect themselves

"Library professionals maintain that books are what education scholar Rudine Sims Bishop called the “mirrors, windows and sliding glass doors” that allow readers to learn about themselves and others and gain empathy for those who are different from them. 

The drive to challenge, ban or censor books has not only changed the lives of librarians across the nation. It’s also changing the way librarians are now educated to enter the profession. As a library school educator, I hear the anecdotes, questions and concerns from library workers who are on the front lines of the current fight and are not sure how to react or respond. 

What once, and still is, a curriculum that includes book selection, program planning and serving diverse communities in the classroom, my faculty colleagues and I are now expanding to include discussions and resources on how students, once they become professional librarians, can physically, legally and financially protect themselves and their organizations."

Minnesota colleges grappling with ethics and potential benefits of ChatGPT; Star Tribune, August 6, 2023

Star Tribune; Minnesota colleges grappling with ethics and potential benefits of ChatGPT

"While some Minnesota academics are concerned about students using ChatGPT to cheat, others are trying to figure out the best way to teach and use the tool in the classroom.

"The tricky thing about this is that you've got this single tool that can be used very much unethically in an educational setting," said Darin Ulness, a chemistry professor at Concordia College in Moorhead. "But at the same time, it can be such a valuable tool that we can't not use it.""

Monday, July 3, 2023

At UChicago, a Debate Over Free Speech and Cyber Bullying; The New York Times, July 3, 2023

Vimal Patel, The New York Times ; At UChicago, a Debate Over Free Speech and Cyber Bullying

"Mary Anne Franks, a University of Miami law professor who studies civil rights and technology, said that universities should pay more attention to the intimidation of faculty members.

Cyberbullying “is much more intentional, vicious and threatening to a person than someone shouting unpleasant things to a person during a talk,” she said, adding that Mr. Schmidt’s behavior “was very much calculated to generate exactly the reaction that it did.”"

Tuesday, March 7, 2023

Register for ‘Ethics, Institutional Review Boards and Scholarly Activities: Pitfalls and Parapets’; WV Mountaineer ENews, March 7, 2023

WV Mountaineer ENews; Register for ‘Ethics, Institutional Review Boards and Scholarly Activities: Pitfalls and Parapets’

"All faculty are invited to attend the WVU Health Sciences Center Faculty Development Program presentation “Ethics, Institutional Review Boards and Scholarly Activities: Pitfalls and Parapets” from noon to 1 p.m. on March 14.

The presenter is Steve Davis, associate professor in the Department of Health Policy, Management and Leadership.

To register by noon on March 13, contact HSCfacultydevelopment@hsc.wvu.edu. Make sure to include the date and title of this presentation in your email. 

Registration is required to receive the Zoom access code. Access information will be sent to participants the day prior to the session. Please do not share the Zoom code."

Wednesday, January 18, 2023

Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach; The New York Times, January 16, 2023

Kalley Huang, The New York Times ; Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach

"In higher education, colleges and universities have been reluctant to ban the A.I. tool because administrators doubt the move would be effective and they don’t want to infringe on academic freedom. That means the way people teach is changing instead."

Saturday, December 17, 2022

Ethics Hotline available to Ohio University employees; Ohio University, Ohio News, December 5, 2022

Ohio University, Ohio News; Ethics Hotline available to Ohio University employees

"The following message was shared with Ohio University faculty and staff on Monday, Dec. 5, 2022

Dear Ohio University employees,

Promoting a responsible and ethical workplace is everyone’s responsibility. For more than 16 years, Ohio University has demonstrated its commitment to this principle by contracting with EthicsPoint® to provide the Ohio University Ethics Hotline. The hotline can be used to report concerns of fraud, waste, abuse, or non-compliance with regulations or University policies—anonymously if so desired. This process is managed by the Office of Audit, Risk, and Compliance, and additional information can be found on the Office’s website.

Members of the University community may submit an anonymous report one of two ways: through a toll-free number or through a web-intake process on any computer or mobile device. Reports made to the hotline—via phone or website—are triaged and responded to by anonymous dialogue between the reporter, Audit, Legal Affairs, or the University representative who can most appropriately respond to the concern.

The University encourages employees to report concerns through normal lines of communication, such as to a supervisor or to an office or individual whose responsibility it is to handle such reports. However, when employees are uncomfortable doing so, the hotline offers an alternative for filing concerns anonymously. The University prohibits retaliation against an individual who in good faith reports concerns or provides information about suspected University-related misconduct, whether reported through normal channels or through the hotline. 

If you have concerns about possible fraud, waste, abuse of University assets, or other compliance or regulatory issues, you can file a report from any computer or mobile device on the Ohio University Ethics Hotline, or by calling EthicsPoint toll-free at (866) 294-9591. 

While investigations are conducted in a highly confidential manner, it should be noted that records generated during an investigation may be subject to disclosure in accordance with applicable laws, including Ohio’s Public Records Act. The University is also required by Ohio law to make the University community aware of an additional fraud hotline maintained by the Ohio Auditor of State. This additional hotline resource is available by calling (866) 372-8364. 

Thank you for doing your part in creating an open and ethical culture here at Ohio University. 

Marion L. Candrea
Chief Audit Executive"

Monday, May 16, 2022

Texas A&M Weighs Sweeping Changes to Library; Inside Higher Ed, May 16, 2022

Josh Moody, Inside Higher Ed; Texas A&M Weighs Sweeping Changes to Library

"The Texas A&M University system is working on a plan that would make sweeping changes across its 10 libraries. Those changes, still being discussed, would include asking librarians to relinquish tenure or transfer to another academic department to keep it.

The plan grew out of recommendations from MGT Consulting, which Texas A&M hired in June 2021 “to conduct a high-level, comprehensive review of major functional areas,” according to a company report. But as administrators have suggested additional changes, including to employee classification, faculty members have pushed back, arguing that proposed structural changes to the library system will do more harm than good.

They are especially concerned about a proposal that would end tenure for librarians. Experts note that tenure for librarians, which is somewhat common in academia, though not universal, can be crucial for academic freedom, especially in a political environment in which librarians are under fire."