VOL. I, NO. 1 • SUNDAY, JANUARY 19, 2026 • PRICE: ONE MOMENT OF ATTENTION
DULL & BORING
“Where the Receipts Go to Die and the Truth Learns to Swim”
EVERYTHING IS CONNECTED (BY VERY FRAGILE STUFF)
A special report on how we broke the machines that tell us what’s true—and the strange things happening now that we’ve noticed
Dear reader, welcome to a paper about forgetting.
Specifically, it’s about how modern civilization—which prides itself on precision, documentation, and receipts—has somehow engineered a world where the receipts keep vanishing. The DNA evidence might be manufactured. The hyperlink in the Supreme Court opinion leads to a cat food advertisement. The scientific paper your doctor relied on was produced by a factory in an undisclosed location and purchased by someone who needed a publication credit. The safety manual in the cloud has been updated seventeen times without anyone noticing, and nobody can tell you which version was in effect when the bridge was inspected.
This edition of Dull & Boring brings you several interconnected stories that, taken together, paint a picture both alarming and oddly hopeful. A forensic scientist in Colorado allegedly manipulated DNA evidence for nearly three decades while her supervisors looked the other way—and a state legislature is now scrambling to prevent it from happening again. Swedish and Danish schools are spending hundreds of millions of euros to buy back the textbooks they threw away a decade ago, having discovered that children learn better when they aren’t being pinged by notifications every forty-five seconds. Scientific paper mills—yes, factories that manufacture fake research—have become so efficient that fraudulent studies may soon outnumber legitimate ones in certain fields. Half the web links cited in Supreme Court opinions now lead nowhere. And a major academic publisher decided that libraries shouldn’t be allowed to own books anymore, only rent them—prompting librarians to compose strongly-worded letters.
The connecting thread, which runs through all these stories like a frayed electrical wire, is this: we built systems that assumed honesty, permanence, and friction, then removed all three. Now we’re putting them back, one crisis at a time.
We hope you’ll stay. The news is dark, but the solutions are surprisingly analog.
❧ ❧ ❧
THE GOLDEN CHILD WHO ALLEGEDLY CHEATED FOR 29 YEARS
A Colorado DNA analyst faces 102 felony charges as more than a thousand criminal cases come under review
The Colorado Bureau of Investigation called Yvonne “Missy” Woods their “golden child” of forensic science. She joined the agency in 1994, rose to become a go-to expert witness, and trained prosecutors and law enforcement officers across the state on the finer points of DNA evidence.
Woods now faces 102 felony charges including forgery, perjury, cybercrime, and evidence tampering. State investigators say she manipulated DNA test results in more than a thousand criminal cases over nearly three decades. Michael Clark, convicted of a 1994 murder largely on DNA evidence Woods had handled, saw his conviction vacated in April 2025 after independent retesting statistically excluded him as the source of crime scene DNA. He had already served twelve years of a life sentence.
The scandal’s mechanics reveal how forensic verification systems can fail catastrophically when they assume baseline integrity. According to state investigators, Woods allegedly altered DNA quantification values by inserting zeros after the decimal point—changing measurements like .00025 to .000025—making DNA amounts appear too small to profile and thereby avoiding additional testing that might contradict her conclusions. She allegedly deleted inconvenient data points, re-ran batches without documentation, and concealed possible contamination.
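The arithmetic of the alleged manipulation is simple, which is part of why it went undetected for so long. The sketch below illustrates it; the profiling threshold is a hypothetical value chosen for illustration, not an actual CBI cutoff, and the cross-check function is our own invention, not a procedure the lab used.

```python
# Hypothetical illustration: inserting a zero after the decimal point
# shrinks a DNA quantification value tenfold, pushing it below the
# threshold at which a profile would be attempted.
PROFILING_THRESHOLD = 0.0001  # ng/uL -- hypothetical value, for illustration

def insert_zero(value: float) -> float:
    """Shift the value one decimal place, as in .00025 -> .000025."""
    return value / 10

instrument_reading = 0.00025                      # profilable as measured
reported_value = insert_zero(instrument_reading)  # now "too small to profile"

assert instrument_reading > PROFILING_THRESHOLD
assert reported_value < PROFILING_THRESHOLD

# A routine cross-check comparing analyst-reported values against raw
# instrument output would flag the tenfold discrepancy immediately.
def flag_discrepancy(raw: float, reported: float, tolerance: float = 0.01) -> bool:
    return abs(raw - reported) > tolerance * raw

assert flag_discrepancy(instrument_reading, reported_value)
```

The point of the sketch is how cheap the catch would have been: a one-line comparison between what the instrument recorded and what the analyst reported.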
What makes the case a structural indictment rather than a story of individual pathology is the documented trail of ignored warnings. Colleagues raised concerns about Woods in 2014 and 2018. An independent audit described past CBI leadership as “autocratic” and “punitive” toward questioners. The peer review system that should have caught flawed work failed because reviewers sometimes approved complex DNA analyses in as little as sixty seconds.
THE NUMBERS TELL THE STORY
The remediation costs are staggering. Colorado has allocated more than $11 million to retest evidence and review impacted cases. Processing time for new rape kits spiked to over 500 days as resources were diverted to reviewing Woods’s past work. Denver police expanded their review to more than 1,300 sexual assault cases.
The cost of prevention would have been modest by comparison. Blind proficiency testing—where analysts receive samples of known composition disguised as routine casework—costs roughly $200,000 annually for a lab of CBI Denver’s size, less than 2% of what taxpayers are now spending on remediation.
Colorado’s legislature responded with HB25-1275, the Forensic Science Integrity Act. The law mandates that crime lab employees report any knowing misconduct or significant procedural deviation within 14 days. District attorneys must notify defendants and victims if lab personnel involved in their case are found to have committed misconduct. Defendants gain explicit statutory rights for post-conviction relief based on forensic integrity failures.
The lesson extends beyond Colorado. When verification systems rely on trust rather than adversarial scrutiny, when peer review assumes good faith rather than checking raw data, the chain of custody for truth depends entirely on individual honor. And individuals, as the Woods case demonstrates, can fail for nearly thirty years running.
Infographic: The lifecycle of a DNA sample through a forensic lab, showing intervention points where manipulation could occur—at quantification, batch processing, and final reporting.
FOR FURTHER READING: PERSPECTIVES
PRO “Why Strong Forensic Oversight Protects Everyone” — Innocence Project
Argues that mandatory external oversight, blind proficiency testing, and transparent lab practices are essential to preventing wrongful convictions and maintaining public trust in forensic science.
Source: innocenceproject.org (2023)
CON “The Danger of Over-Regulation in Forensic Labs” — Criminal Legal News
Contends that excessive regulatory burdens could slow case processing, increase backlogs, and that existing accreditation standards—when properly enforced—are sufficient to catch misconduct.
Source: criminallegalnews.org (December 2018)
❧ ❧ ❧
THE FACTORY THAT MAKES SCIENCE OUT OF NOTHING
Paper mills are manufacturing fake research faster than journals can retract it
Someone will sell you a scientific paper with your name on it for roughly $25,000, depending on where you want it published. The data will be fabricated. The experiments will never have occurred. The peer review may be conducted by accomplices. And unless you’re unlucky, the paper will join the scientific record and start accumulating citations.
This is the paper mill industry, and according to a study published in August 2025 in the Proceedings of the National Academy of Sciences, it is growing faster than the legitimate scientific enterprise can contain it.
The numbers are disorienting. One estimate suggests at least 400,000 papers published between 2000 and 2022 show hallmarks of having been produced by paper mills. Only 55,000 were retracted or corrected during that period. Researchers at Northwestern University found that fraudulent papers are doubling every 1.5 years, while retractions are doubling only every 3.5 years. At current rates, the researchers predict that only 15-25% of paper mill products will ever be retracted.
“It’s like emptying an overflowing bathtub with a spoon.” — Luis Amaral, Northwestern University
THE ECONOMICS OF DECEPTION
Paper mills exploit the “publish or perish” culture of global academia, where publication counts drive hiring, promotion, and funding. The industry has industrialized to meet demand. A January 2025 investigation linked a single network—designated “Tanu.pro”—to 1,517 fraudulent papers across 380 journals, involving more than 4,500 scholars from 46 countries. Springer reported receiving over 8,400 submissions tied to this network; roughly 80 made it through peer review.
The fraud leaves forensic fingerprints. Paper mills employ “tortured phrases”—bizarre constructions produced by running text through synonym-replacement software to evade plagiarism detection: “surface region” instead of “surface area,” “hard struggle” instead of “hard work.” The Problematic Paper Screener has catalogued more than 7,500 such phrases, with “surface region” alone appearing in 42,500 published papers.
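Screening for tortured phrases is, at its core, dictionary lookup at scale. A minimal sketch in the spirit of the Problematic Paper Screener follows; the two-entry phrase list is an illustrative sample standing in for the real catalogue of more than 7,500 entries, and the function name is our own.

```python
# Minimal tortured-phrase screen: map known tortured constructions back
# to the standard phrase they likely replaced.
TORTURED_PHRASES = {
    "surface region": "surface area",
    "hard struggle": "hard work",
}

def flag_tortured_phrases(text: str) -> list[tuple[str, str]]:
    """Return (tortured, expected) pairs found in the text."""
    lowered = text.lower()
    return [(bad, good) for bad, good in TORTURED_PHRASES.items()
            if bad in lowered]

hits = flag_tortured_phrases(
    "We computed the surface region of the nanoparticle.")
# hits -> [("surface region", "surface area")]
```

The real screeners add fuzzy matching and per-field base rates, but the principle is the same: synonym-replacement software leaves a lexical residue that honest writing almost never produces.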
The downstream contamination is what keeps research integrity experts awake. When fraudulent papers enter the literature, they get cited. Those citations inform systematic reviews. Reviews shape clinical guidelines. A 2025 analysis in JAMA found that 35% of meta-analyses experienced at least a 10% change in effect estimates after eliminating studies later retracted for fraud—with some conclusions entirely invalidated. As of late 2024, 157 clinical guidelines still referenced tainted meta-analyses.
Cancer research may be the most compromised field; researchers studying the problem warn that a substantial share of the literature is now “probably completely unreliable.”
THE AI ARMS RACE
Generative AI has changed the calculus. Paper mills can now produce synthetic datasets that mimic the statistical noise of genuine experiments and fabricated images that possess unique background patterns, appropriate contrast levels, and realistic experimental artifacts.
Detection tools are struggling to keep up. A February 2025 study evaluating AI detection tools against ChatGPT-generated Western blot images—a common type of laboratory result—found catastrophic performance. One tool achieved 95% sensitivity but only 54% specificity, meaning it caught most actual fakes while misclassifying nearly half of all genuine images as fraudulent. Another tool missed over 80% of the AI-generated fakes entirely.
The Royal Swedish Academy of Sciences hosted a 2025 conference that produced the “Stockholm Declaration” on reforming scientific publishing. Signatories called for moving away from publication-count metrics, establishing independent fraud detection bodies, and potentially criminalizing industrial-scale scientific fraud.
But reform faces fundamental obstacles. Journals make money by publishing papers. Authors need publications for their careers. Institutions celebrate high productivity. In the short term, everybody appears to win, except science and truth.
Infographic: The scale of paper mill operations—a network diagram with Tanu.pro at center connecting to 380 journals across 46 countries, with detection and retraction rates shown as contrasting flows.
FOR FURTHER READING: PERSPECTIVES
PRO “Stamp Out Paper Mills—Science Sleuths on How to Fight Fake Research” — Nature
A group of scientists who specialize in detecting fraudulent research outline five essential steps to combat industrialized misconduct, including reforming research incentives and funding independent detection efforts.
Source: nature.com (January 2025)
CON “The Strain on Scientific Publishing” — Quantitative Science Studies
Argues that focusing narrowly on paper mills distracts from deeper systemic issues in scientific publishing, including unsustainable growth, inadequate peer reviewer compensation, and the need for wholesale reform of how research is evaluated.
Source: Hanson et al., QSS 2024 (2024)
❧ ❧ ❧
THE SUPREME COURT OPINION THAT LEADS TO NOWHERE
Half the web links in the nation’s highest court decisions have rotted away
Click the hyperlink in a Supreme Court opinion citing an important government statistic, and there’s a coin flip’s chance you’ll find what the Justice was referencing. You might instead encounter a 404 error, an unrelated webpage, or the digital equivalent of static.
In 2014, Harvard law professors documented that more than 70% of URLs in Harvard legal journals and 50% of URLs in Supreme Court opinions suffered from “reference rot”—the link either failed entirely or led to content different from what was originally cited. A decade later, the problem has not improved. A January 2026 longitudinal study tracking web citations across twenty years of scholarship found that only 38% of citations older than ten years remain accessible, with 15% “permanently unrecoverable” by any method.
The legal implications are profound. The common law principle of stare decisis—Latin for “to stand by things decided”—presumes that the reasoning of past decisions remains accessible. If the evidence underpinning a ruling vanishes, the foundation of jurisprudence becomes unverifiable.
THE MECHANICS OF DECAY
Link rot is the simpler problem: URLs stop working. Content drift is subtler and more insidious: the link remains active, but the underlying content has changed. Government webpages get updated to reflect new administrations. Corporate policies evolve. Academic resources are reorganized. The citation still points somewhere, but no longer to what was originally referenced.
Commercial domains (.com) show the worst performance: only 42% remain accessible over time. Government domains (.gov) fare better at 78%. Educational institutions (.edu) lead at 93%. Static PDF files survive at 92%; database-driven dynamic content at only 41%.
Solutions exist but remain underadopted. Perma.cc, developed by Harvard Law School’s Library Innovation Lab, creates archived snapshots of webpages at the moment of citation—permanent links that cannot drift. The 22nd edition of The Bluebook (2025) now states that all online content cited by an author should be captured and stored in a permanent setting. Over 150 journals, courts, and universities use Perma.cc.
But Google’s decision to discontinue cached page views removed a traditional fallback for recovering disappeared content. The company’s cessation of goo.gl URL shortening means those shortened links will soon return 404 errors. The 9th U.S. Circuit Court of Appeals maintains a public database of opinions with PDFs of all cited web content—but it remains an exception rather than the norm.
“Broken links in the bibliography are, in effect, broken links in an argument.”
When the Judicial Conference of the United States—an august body overseen by the Chief Justice—issued guidelines for citing Internet materials in judicial opinions, their announcement was posted to the web. Researchers attempting to cite that very announcement years later discovered the link had rotted.
Infographic: A decay timeline showing how a Supreme Court citation degrades over time—Year 1: 100% accessible; Year 5: 87% accessible; Year 10: 62% accessible; Year 15: link vanished or content changed.
FOR FURTHER READING: PERSPECTIVES
PRO “Perma: Scoping and Addressing the Problem of Link and Reference Rot in Legal Citations” — Harvard Law Review
The foundational paper making the case for systematic web archiving in legal scholarship, arguing that libraries must take a central role in preserving the integrity of cited sources.
Source: harvardlawreview.org (2014)
CON “The Costs of Over-Citation” — Yale Journal of Law and Technology
Argues that the focus on preserving every citation may be misplaced—many links are to ephemeral or low-value content, and resources might be better spent improving the quality of citations rather than archiving everything.
Source: Yale JOLT Archives (various)
❧ ❧ ❧
A LIBRARY THAT OWNS NOTHING ISN’T A LIBRARY
Major academic publisher ends book ownership for libraries, sparking outcry
In February 2025, Clarivate—parent company of ProQuest and its Ebook Central platform—announced that libraries would no longer be permitted to purchase digital books outright. Instead, institutions would “pivot to subscription-based access models.” The company framed the shift as responding to evolving needs and AI-driven discovery requirements.
The library community responded with what might be charitably described as vigorous disagreement.
“Our entire acquisitions strategy is being upended with this change. Fewer choices for content providers, fewer options for digital ownership, further erosion of the first sale doctrine, and monopolies are bad for libraries!!!” — Librarian on social media
The Association of College and Research Libraries (ACRL) took the position that both subscription and perpetual-access options should remain available. Seven UK academic and research organizations sent a joint letter to Clarivate’s CEO calling for an immediate halt to the plans. By March 2025, Clarivate extended its implementation timeline to June 30, 2026, but the fundamental direction remained unchanged.
THE PROBLEM WITH RENTING KNOWLEDGE
When access to a medical handbook or engineering specification depends on continuous subscription renewal, institutional memory becomes contingent on budget cycles. Historical research requiring access to specific past editions becomes problematic. The version a professional relied upon at the moment of action may not be recoverable.
Contemporary digital publishing platforms frequently update content without clear versioning. Unlike physical reporters of the twentieth century, where a printed error required a formal errata sheet, digital databases can undergo “silent updates”—a headnote rewritten, a case flag changed, a specification modified without a visible audit trail. The “law” as it appears on screen in 2026 may differ subtly from the “law” as it appeared in 2020, yet both are presented as authoritative.
Clarivate’s own policies acknowledge the instability. Their announcement noted: “We will continue our bi-annual schedule of title removals from subscriptions in June and December. There may be occasional off-cycle removals due to legal reasons or loss of publisher rights.” Librarians observed that losing essential materials mid-academic-year could create nightmarish scenarios for students and researchers.
Competitor EBSCO responded within 24 hours, explicitly affirming its commitment to perpetual access options. “EBSCO has a long-standing commitment to these services,” said Jon Elwell, senior VP of books, “and we have no plans to discontinue them.”
The deeper issue is who controls the permanent record. A surgeon consulting a digital edition of Gray’s Anatomy in 2050 may have no way to verify that the text matches what was authoritative in 2025. An engineer referencing a safety specification may be reading version 47 while the bridge was certified under version 31. A lawyer citing precedent may be unable to reconstruct what the law said when a contract was signed.
“A library that owns nothing isn’t fulfilling its mission.”
Infographic: Comparison table showing Perpetual License vs. Subscription Model across dimensions of ownership, cost structure, preservation value, content stability, and risk on cancellation.
FOR FURTHER READING: PERSPECTIVES
PRO “Introducing ProQuest Ebooks: The World’s Largest Scholarly Ebook Subscription” — Clarivate
Clarivate’s case for subscription models, arguing that subscriptions provide broader access, lower barriers to discovery, and align with how students actually consume content in the streaming era.
Source: clarivate.com (August 2025)
CON “What Happens If Libraries Can’t Buy Ebooks?” — Inside Higher Ed
ACRL president Leo S. Lo argues that eliminating ownership options forces libraries into cycles of ongoing payments, intensifying concerns about collection stability and the ability to preserve scholarly materials.
Source: insidehighered.com (March 2025)
❧ ❧ ❧
THE COUNTRIES THAT THREW AWAY THEIR TEXTBOOKS AND WANT THEM BACK
Sweden and Denmark spend hundreds of millions to restore analog learning
Sweden embraced educational digitization earlier and more comprehensively than most nations, introducing tablets in preschools and replacing textbooks with screens across all grade levels. By 2023, the consequences had become impossible to ignore.
Swedish fourth-graders’ reading scores dropped from 555 in 2016 to 544 in 2021 in the Progress in International Reading Literacy Study—still above the European average, but a troubling decline for a nation that had been a global leader. The Karolinska Institute, one of the world’s most prestigious medical universities, issued a position paper: “There’s clear scientific evidence that digital tools impair rather than enhance student learning.”
The Swedish government responded with what officials called a “back to basics” approach. Beginning in 2023, the government committed the equivalent of €230 million to purchasing physical textbooks, with the stated goal of one printed textbook per student per subject. Education Minister Lotta Edholm declared: “The mass digitization of the school has been a mistake. Screens have been allowed to displace books, paper and pencils. We are now breaking that trend.”
The curriculum for preschools was amended to remove requirements for digital learning tools. For children under age two, only analog learning tools—such as books—are now permitted. Starting in 2026, a nationwide school phone ban will take effect for the full school day.
DENMARK FOLLOWS
Denmark announced in October 2025 that mobile phones will be banned in all schools and after-school programs beginning in the 2026/2027 school year—not as guidance left to individual headmasters, but as national directive. Education Minister Mattias Tesfaye put it bluntly: “Mobile phones in schools and after-school clubs create disruption and stress in children’s everyday lives. I’m pleased that we’re now tackling this so we can focus on what really matters—learning, community and peace of mind.”
The Danish ban is comprehensive, covering smartphones and smartwatches, with strict exceptions only for health monitoring, school-owned devices under teacher supervision, and students with disabilities requiring assistive technology. Schools will also be required to block websites unrelated to learning, including social media, gaming, and streaming platforms.
A 2024 Danish Ministry of Education survey found that more than one-third of pupils felt distracted by digital devices during lessons. Data from Scotland’s Children’s Commissioner revealed that 10- and 11-year-olds were spending up to nine hours daily on screens during weekends.
THE EVIDENCE MOUNTS
Research published in December 2024 concluded that digital reading doesn’t provide the same comprehension benefits as print reading. For younger students—those in elementary and middle school—reading on screens actually lowered reading comprehension skills compared to paper books.
The Swedish Agency for Media reported in September 2025 that among 9-12-year-olds, average daily digital device use had decreased by 40 minutes since 2022. The share of 9-year-olds without cellphones almost doubled. One major Swedish electronics chain reported that sales of “dumb phones” tripled from 2022 to 2024.
The implications extend beyond primary education. If early digital exposure undermines foundational literacy and concentration, effects compound through professional training. Medical students, law students, and engineering students who struggle with sustained reading will struggle with the dense textual traditions of their professions.
What Sweden and Denmark are recovering is something the digital revolution promised to make obsolete: friction. The physical book cannot be updated silently. It cannot notify you. It cannot track you. Its content is immutable and verifiable. These are not bugs but features.
Infographic: Timeline showing Sweden’s journey from 2014 tablet rollout through declining reading scores (2021) to the €230 million textbook reinvestment (2023-2026), with Denmark’s parallel trajectory.
FOR FURTHER READING: PERSPECTIVES
PRO “Sweden Went All In on Screens in Childhood. Now It’s Pulling the Plug.” — After Babel
A detailed account of Sweden’s reversal, arguing that the evidence against educational technology for young children is now overwhelming and that other nations should follow Sweden’s lead.
Source: afterbabel.com (November 2025)
CON “The Screen That Ate Your Child’s Education” — Salt Lake Tribune
Opinion piece arguing that the problem isn’t screens per se but school-issued devices without adequate content filtering, and that banning technology altogether throws out genuine educational benefits.
Source: sltrib.com (November 2025)
❧ ❧ ❧
WHEN THE SEAL BREAKS, THE FILE LIES
The emerging technology that could make truth permanent—or create a two-tier information economy
At the moment a laboratory instrument captures a result, a hardware security module cryptographically signs the file. If anyone later changes a single pixel or data point, the mathematical seal breaks. The tampering becomes computationally detectable.
This is the premise of the Coalition for Content Provenance and Authenticity (C2PA)—a standard developed by Adobe, Microsoft, Intel, the BBC, and more than 200 other organizations. Rather than trying to detect fakes after the fact, C2PA creates a tamper-evident chain of custody from the moment of capture through publication.
Think of it as a digital seal that shatters if the wax is disturbed.
The C2PA specification, which released an updated version in May 2025 covering AI workflows, works by embedding “Content Credentials” into files—cryptographically signed records of origin, editing actions, and ownership. A news photograph could carry metadata showing which camera captured it, when, where, and every editing operation performed afterward. If someone splices in material from a different image, the credential no longer validates.
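The seal-and-verify mechanics can be sketched in a few lines. Real C2PA manifests use X.509 certificate chains and COSE signatures; in this simplified sketch an HMAC with a device-held key stands in for the signature so everything stays in the standard library, and the key, camera name, and field names are hypothetical.

```python
# Simplified tamper-evident seal: hash the image, bind the hash and
# capture metadata into a signed record, and verify both on retrieval.
import hashlib, hmac, json

DEVICE_KEY = b"secret-held-in-camera-hardware"  # hypothetical signing key

def seal(image_bytes: bytes, metadata: dict) -> dict:
    record = {"sha256": hashlib.sha256(image_bytes).hexdigest(), **metadata}
    payload = json.dumps(record, sort_keys=True).encode()
    return {"record": record,
            "signature": hmac.new(DEVICE_KEY, payload, "sha256").hexdigest()}

def verify(image_bytes: bytes, credential: dict) -> bool:
    record = credential["record"]
    if hashlib.sha256(image_bytes).hexdigest() != record["sha256"]:
        return False  # pixels changed: the seal is broken
    payload = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, "sha256").hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

photo = b"raw sensor data"
cred = seal(photo, {"camera": "XYZ-100", "captured": "2026-01-19T09:00Z"})
assert verify(photo, cred)
assert not verify(photo + b"\x00", cred)  # one byte altered -> seal breaks
```

Change a single byte of the image, or a single field of the metadata, and verification fails; that asymmetry between easy checking and hard forging is the entire premise.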
The National Security Agency issued guidance in January 2025 recommending Content Credentials as part of a “multi-faceted approach” to combating synthetic media and AI-generated disinformation. The EU AI Act requires AI content labeling; C2PA provides one compliance pathway.
THE LIMITS OF VERIFICATION
But Content Credentials have limitations. The technology is opt-in—nothing prevents someone from distributing unsigned media. Bad actors can sign false information about genuine content. One media company announced use of C2PA verification for a video, then had to issue a correction when the “verified” video turned out to have been misleadingly edited.
Researchers have pointed out that C2PA creates the possibility of a “two-tier” truth economy: verified content trusted by elites and institutions, while the general public navigates an ocean of unverified, potentially synthetic material.
The underlying principle—cryptographic provenance—extends beyond media files. Laboratory instruments could sign results at capture, creating unforgeable records. Technical documentation could carry multi-signature approval for safety-critical changes. Legal citations could include archived snapshots that cannot drift.
Illinois passed the Digital Assets and Consumer Protection Act in 2025, requiring custodians of digital assets to maintain 1:1 reserves and segregate customer holdings—forcing digital assets to behave like physical gold in a vault. The law represents the same instinct: re-imposing the rules of physical custody onto inherently mutable digital instruments.
The common thread across these solutions is the deliberate reintroduction of friction. Cryptographic signatures require computation to create and verification to check. Legislative mandates force organizations to pause, report, and review. Physical books require sustained attention.
The question is whether we can build verification infrastructure fast enough to rescue the foundations before they finish dissolving.
Infographic: Provenance Chain diagram showing glass-to-glass integrity—from camera sensor through cryptographic signing to verified storage and retrieval, with points showing how tampering would break the mathematical seal.
FOR FURTHER READING: PERSPECTIVES
PRO “Strengthening Multimedia Integrity in the Generative AI Era” — NSA/CSS
U.S. government guidance recommending Content Credentials as critical infrastructure for combating AI-generated disinformation, arguing that provenance standards must become part of the defensive toolkit.
Source: media.defense.gov (January 2025)
CON “Privacy, Identity and Trust in C2PA: A Technical Review” — World Privacy Forum
Detailed analysis raising concerns about C2PA’s privacy implications, potential for surveillance, vulnerability to manipulation by sophisticated actors, and the risk of creating false confidence in “verified” content.
Source: worldprivacyforum.org (2024)
❧ ❧ ❧
EDITORIAL: THE COST OF CERTAINTY
The stories in this edition share a common origin: the frictionless expansion of the early digital era. For thirty years, we pursued the removal of barriers to information creation, modification, and transmission. We largely succeeded. And in succeeding, we unmoored the concept of the permanent record.
We built forensic laboratories that assumed analysts would follow protocol. We built academic incentive systems that assumed researchers would report honestly. We built legal citation systems that assumed web content would persist. We built educational technology that assumed screens would enhance rather than distract. We built publishing models that assumed ownership was less important than access.
Each assumption made sense in isolation. Collectively, they created a world where verification became optional and permanence became expensive.
The correction now underway is not a return to the past. Sweden uses digital systems for school administration while returning to paper for instruction. Illinois applies cryptographic requirements to digital assets, forcing them to behave like physical objects under custody. Forensic reforms mandate blind proficiency testing—analog verification of digital results. The emerging C2PA standard uses software to create tamper-evident seals that mimic the behavior of physical wax.
What we are learning is that truth requires weight. Facts must be anchored to survive. Whether that anchor is paper, mathematics, or law, the drift must be arrested.
The alternative is a future in which every citation requires an act of faith, every guideline may have silently changed, and every expert’s testimony rests on evidence whose chain of custody has been irretrievably broken.
The institutions that thrive in coming decades will be those that prioritize stewardship of the permanent record over convenience of the digital moment. That may mean slower publication, fewer papers, more expensive books, and less seamless access. It will certainly mean friction.
But friction, it turns out, is what truth feels like.
FOR FURTHER READING: PERSPECTIVES
PRO “The Case for Reform of Scientific Publishing” — International Science Council
Comprehensive argument for wholesale reform of how science is published, evaluated, and rewarded—including moving beyond publication counts to recognize quality, integrity, and social impact.
Source: council.science (2024)
CON “Redefining Publishing: Why We’re Moving Beyond the Article” — Research Information
Argues that the focus on “permanent records” misses the dynamic nature of knowledge, and that the future lies in living documents, continuous updating, and moving beyond the static article format entirely.
Source: researchinformation.info (October 2025)
❧ ❧ ❧
PRODUCTION NOTE
This edition of Dull & Boring was produced through collaboration between human editorial judgment and AI research assistance. The underlying research report was synthesized from primary sources including court documents, legislative texts, academic studies, and verified journalism. Individual claims should be independently verified before consequential use.
All Pro/Con perspective links were verified as of publication date. Given that link rot is, ironically, one of the subjects of this edition, readers may wish to archive any resources of particular interest.
Your skepticism remains appropriate and encouraged.
COMING NEXT
Next edition: The Synthetic Expert—an investigation into AI-generated academic reviewers, automated journal editing, and the frontier where human and machine scholarship become indistinguishable. Also: What happened when one university tried to verify that all its faculty actually existed.
© 2026 Dull & Boring. All rights reserved.
Editor: Research conducted via Claude | Submissions: Use caution with hyperlinks.