Opinion

The rise of AI and threat to the future of real science

By Dr Xavier Musonye | January 17, 2026
ChatGPT. [GettyImages]

Throughout the history of civilisation, technological evolution has largely improved human life. From the control of fire and the invention of the wheel to the steam engine and the computer mainframe, each breakthrough has propelled society forward. Yet every leap has been accompanied by anxiety.

Literature captured these fears memorably through Mary Shelley’s Frankenstein, where a human-made creation escapes its maker’s control. As artificial intelligence rapidly evolves, similar unease is resurfacing, particularly within scientific and academic inquiry.

In what many nostalgically call the “golden era” of scholarship, research was slow, deliberate, and deeply human. Each source demanded careful reading to understand its methodology, conclusions, and the unanswered questions it left behind. The internet later simplified access, but comprehension still required effort and intellectual engagement.

Digital tools played a supportive role, helping with grammar or clarity rather than content. Knowledge advanced through discipline, not shortcuts. That rhythm changed dramatically in 2022, when AI text generators such as ChatGPT became accessible.

Research, once measured in months or years, could now be produced in minutes. With social media having caused an “invasion of idiots into public spaces,” academia began facing a different invasion: one of unethical practices enabled by AI. Literature reviews, analyses, and even full papers could be machine-generated, raising profound ethical and technical concerns about the future of science. AI has made academic misconduct easier, faster, and harder to detect. Deep-learning systems can fabricate datasets, manipulate images, and generate experimental results that appear plausible at first glance.

While some AI-generated or falsified studies have been retracted, they are often discovered only through painstaking peer review or whistleblowing. This reality risks turning peer reviewers into forensic investigators, diverting time and resources from evaluating scientific merit to policing authenticity. Text generation poses an equally serious challenge. AI-produced papers may look original but often amount to algorithmic rearrangements of existing work. 

Worse still, AI tools frequently invent references: nonexistent articles, misattributed authors, or incorrect titles. Such pseudo-scholarship undermines intellectual property, clogs academic journals, and erodes trust in scientific publishing. The difficulty of identifying AI-generated text further compounds the problem, allowing questionable research to circulate unnoticed.

In higher education, the implications are just as troubling: AI has intensified the “technological arms race” between cheating methods and detection systems. Students and even academics can submit work without conducting genuine research, blurring the line between assistance and deception. Misusing AI effectively becomes a form of misattributed authorship, claiming credit for work not personally done.

While AI can also aid in preventing cheating through tools like watermarking or identity verification, its ethical use must be clearly taught and enforced. No chatbot can conduct laboratory experiments, test materials, or observe biological processes in real-world conditions. 

AI itself is not the enemy; used responsibly, it can enhance data analysis, language editing, literature searches, and discovery. The danger lies in treating it as a substitute for the scientific method rather than a complement to it. Science advances through curiosity, rigour, and engagement with reality, qualities no algorithm can replicate. As academia navigates this new era, it must reaffirm the values of integrity, proper attribution, and hands-on inquiry.

-The writer is a researcher (Energy Systems)


Published: 2026-01-17 09:48:01
Author: Dr Xavier Musonye
Source: The Standard
