02 December 2016

Post fact politics catches up to science communication


There has been much hand-wringing in political discussion these last few weeks about how we are living in a “post fact” world dominated by “fake news.”

Well, hi-di-ho, people, welcome to science education and science communication of the last few decades.

Fake news? Evolutionary biologists have been putting up with people saying things like, “There are no transitional fossils” for-frickin’-ever. Even when you show them Archaeopteryx. We’ve been putting up with the Discovery Institute and Answers in Genesis, who attack established science and chug along regardless of scientific facts and countless debunkings.

We’ve been through the argument that “If only people knew the facts, people would act differently. The facts speak for themselves,” and seen how that has failed, and failed, and failed to budge public opinion on some of the best understood science out there. Unfortunately, even among scientists, this “The facts speak for themselves” attitude is still common. People who say otherwise, like Randy Olson and Matthew C. Nisbet, have received far more criticism for pointing this out than they deserve.

Now the same strategies are no longer confined to a few hot button scientific topics; they’ve metastasized over the whole body politic in multiple countries.

It’s to our shame that we science educators and science communicators didn’t figure out effective ways to deal with those kinds of issues.

Picture from here.

29 November 2016

Tuesday Crustie: Shiny


If I’d known Moana had a giant decorator crab in it, I’d have pre-ordered tickets.


Tamatoa also gets the second best song in the movie, although even it pales in comparison to “You’re Welcome.”

28 November 2016

Are footnotes a way to game the Impact Factor?


One of Bradley Voytek’s 99 problems is strange journal demands:

Major journal said we can’t cite biorxiv papers; instead must reference them via footnotes.

I have been rankled by journals’ refusals to cite non-traditional sources before. But this journal wasn’t refusing to acknowledge a source. It was refusing to acknowledge a source in a certain way.

This puzzled me momentarily, but I have a hypothesis. Any time a journal talks about fiddling with citations, there is a prime suspect as to why: the journal Impact Factor. I strongly suspect that footnotes aren’t counted in the calculations of journal Impact Factor like terminal references are, even though footnotes and a reference list in this case would serve the same purpose: to credit a source so that people can find it.
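For the record, the two-year Impact Factor is simple arithmetic: citations in a given year to items the journal published in the previous two years, divided by the number of citable items published in those two years. A minimal sketch in Python (the numbers are invented, purely for illustration):

```python
# The two-year journal Impact Factor, in its simplest form.
# All numbers below are hypothetical, used only to show the arithmetic.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Citations this year to papers from the previous two years,
    divided by the number of citable items in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Say a journal received 300 citations in 2016 to papers it published
# in 2014-2015, and published 150 citable items in those years:
print(impact_factor(300, 150))  # 2.0
```

The hypothesis above amounts to a guess that footnoted sources never enter the numerator of this calculation, while terminal references do.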

What might a journal have to gain by keeping pre-print servers out of the citations? It doesn’t enhance the journal’s own Impact Factor. It doesn’t enhance anyone’s Impact Factor, for that matter. Denying citations to pre-print servers seems futile, since pre-print servers don’t have Impact Factors.

While pre-print servers don’t have Impact Factors, including citations to them might make it easier to collect data about their use. There seems little doubt that the majority of citation analysis is done by text mining and algorithms, rather than by hand. (Notwithstanding the contention by Brembs et al. (2013) that Impact Factors are often negotiated.)

The very act of data collection about pre-print servers might feel threatening to journals. There are some researchers who want journals to die across the board and wouldn’t mind if pre-print servers (or something like them) rose up to take their place. If it becomes clear through citation analysis that more and more studies on pre-print servers are being cited as reliable sources of information, an uncomfortable question for journals arises:

“What are journals for, exactly?”

Update, 29 November 2016: Bradley Voytek reports that the situation has changed:

The journal editors discussed and changed their policies to allow preprints with DOIs.

How interesting.

References

Brembs B, Button K, Munafò M. 2013. Deep impact: unintended consequences of journal rank. Frontiers in Human Neuroscience: http://dx.doi.org/10.3389/fnhum.2013.00291

Related posts

Why can’t I cite Mythbusters?

Picture from here.

No lead is safe


I normally tell people that I hate football. But homesickness makes you do funny things, so I tuned into the last quarter of the Grey Cup last night.

I was rewarded.

At some point, when the lead was still pretty big for Ottawa, one of the commentators said a CFL motto was, “No lead is safe.” I sort of snickered when I heard that. I would not have believed Calgary could score two touchdowns in less than two minutes, with the last coming in with something like 20 seconds left on the clock to force the game into overtime. What a thriller! At that point, it didn’t matter who won, you just had a great championship game.

The Redblacks are new, formed after I moved to the US. I looked a few things up about the team while the game was in progress. When I went to Wikipedia, I wondered if I was on some sort of time delay, because the entry said, “In the 104th Grey Cup, the Redblacks brought the Grey Cup back to Ottawa for the first time in 40 years.” What...? But... but... the game was still going.

When I came back a few minutes later to get a screen grab of the jumped gun, I saw someone had already had some fun with Wikipedia (click to enlarge):


Of course, the Redblacks had the last laugh on the irate wiki contributor, pulling off the overtime win.

Even this football hater can’t resist an underdog victory. Congratulations to the Redblacks on their first Grey Cup win!

External links

Redblacks pull off huge upset to win 104th Grey Cup in OT
Redblacks player lays on the field long after everyone leaves, perfectly wrapping up the Grey Cup

22 November 2016

Watch me now


A “watchlist” has one major job: to intimidate. And boy, Professor Watchlist does that in spades.

The mission of Professor Watchlist is to expose and document college professors who discriminate against conservative students, promote anti-American values, and advance leftist propaganda in the classroom. 

And just like that, we’re in a new era of McCarthyism. The website is horribly vague on what confirmation or vetting goes into this list, what an “anti-American” value is, or what constitutes “leftist propaganda.”

I agree with one thing on this list: professors shouldn’t discriminate against conservative students. Because professors shouldn’t discriminate against anyone.

But hey, conservative students, your ideas have to compete in the free market of ideas and be supported with facts and evidence. That is, conservative students, you don’t get to cry “Discrimination!” if I say, “Evolution is the best scientific explanation we have for diversity of life on this planet” because you happen to be a conservative young Earth creationist.

The website says it’s a project of TurningPoint USA, but the link to it is not always predictable. A link on Twitter informed me that this is the brainchild of one Charlie Kirk.

I clicked on one entry at random, and it linked out to a site called Campus Reform, which I’m pretty sure I’d seen before. It’s part of the Leadership Institute, which describes itself as “Training conservative activists, students, and leaders since 1979.”

I agree with Dr. Becca. Universities need to talk about sites like this, publicly. I was also toying with something Trina McMahon did: flooding the site with “tips.”

And perhaps this is an apt moment to repost this:


Update, 23 November 2016: One of the creators of Professor Watchlist, Alana Mastrangelo, is super happy that people have taken to trolling the “tips” section of the website. Free publicity.

Another creator of Professor Watchlist, Crystal Clanton, claims that messages sent to the tip line tell the site’s creators, “I pray for your deaths every day.”


To anyone who would consider writing something like this:


You’re not helping. Knock it off.

External links

Professor Watchlist

The 21-Year-Old Becoming a Major Player in Conservative Politics (from 2015)
David Perry discussing the Watchlist
Heather Cox Richardson on being added to the Watchlist
Exposing 'crazy radical professors': 12 of the best #trollprofwatchlist tweets
Professor Watchlist Is Seen as Threat to Academic Freedom
Teaching in a time of professor watchlists

21 November 2016

Keeping to schedule

My university publishes an academic calendar of holidays and exams well in advance of the semester. The university is closed this Thursday and Friday (American Thanksgiving). But since last week I’ve had a steady stream of students asking me if we are having classes on Wednesday, and I even got one asking me, “Are we having class today?” (Monday.)

My answer is, “Yes, the university is open and class is happening as usual, as per university policy.”

“Other professors are cancelling class that day.”

I cannot tell you how much this annoys me. I’m not so much annoyed by the students asking, but by my colleagues.

Professors who cancel classes because it’s close to a holiday aren’t being professional. People in other jobs and other professions don’t just get to randomly not show up to work. But professors can cancel class pretty much whenever they want. And someone would probably need to cancel a lot of classes before a department chair or other administrator caught on and commented. This is the sort of thing that gets legislators breathing down our necks with arguments that professors have no accountability.

It bothers me because students are getting shortchanged. Students pay tuition for a certain number of contact hours, and they should be upset that they are not getting the instruction and face time that they are paying for. I suspect that few students think of it this way, probably because many still see their relationship with professors as an adversarial one. A cancelled class is just less work, rather than a missed opportunity to learn. Unfortunately, professors who cancel classes because it’s close to a holiday set a bad example and encourage this “classes are just another thing I have to do” view.

So no, my classes are not cancelled this week. Because I am a professional who takes my obligations seriously.

18 November 2016

How I learned advanced math from a fake documentary

Q. Are pseudoscience shows like Ancient Aliens having a negative effect on the scientific literacy of Americans? (From Quora.)

If you want to rank the biggest negative impacts on the scientific literacy of Americans, I would not put pseudoscientific television documentaries on basic cable at the top of the list.

If you look at the issues where the public disagrees with scientists the most (climate change, evolution, vaccines, genetic modification of food), it’s not because they don’t have access to facts or that basic cable documentaries have misled them. It’s because those issues have become political issues, and political leaders and political pundits actively promote narratives that are not scientifically justified.

That said, sensationalist TV shows like this do have an impact, and that effect is probably generally negative. Andrew David Thaler writes about some of the long term effects here: The Politics of Fake Documentaries. See also: Fish tales: Combating fake science in popular media.

But.

Here’s the thing. This “ancient aliens” genre is not new. It reaches back at least to the late 1960s when Chariots of the Gods? was published, and the early 1970s, when books like this were published:


Now, I read that book as a kid. Yes, there’s a lot of rubbish in it, and I was pretty gullible. I thought a lot of it was plausible. Hey, what did I know, I was a kid. Did reading that book hurt my science literacy?

Well, amid all the credulous interpretations of archaeological data (“This ancient gold trinket is described as a bird, but it looks like a jet plane!”), there’s a chapter that I think was called “Shortcuts in space-time” or something like that. And that’s the chapter I remember most about that book. It introduced me to the concept of a tesseract. That’s real mathematics and real science, and it stuck with me because as I learned more, I learned that those ideas were real.

That book was unscientific. But it made me curious. And I learned some new science that I probably wouldn’t have otherwise been exposed to until university, if then. And when I got to university, I discovered The Skeptical Inquirer and learned to be a little less gullible.

I’m not arguing that those are a good means of science education. But they are works of art more than science, and art – and our reactions to it – are complex.

17 November 2016

Why is neuroscience teaching software so bad?


Neuroscience is a discipline that is very well suited to using computer models. There are all sorts of elegant mathematical descriptions of how neurons generate action potentials, how signals propagate along the length of a neuron, how signals from neurons add up and contribute to firing action potentials or not, and more.
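To give a flavor of how compact those descriptions are: a leaky integrate-and-fire neuron, one of the simplest of the models alluded to above, fits in about a dozen lines of Python (the parameter values here are arbitrary, chosen only for illustration, and not taken from any of the packages discussed below):

```python
# A minimal leaky integrate-and-fire neuron: the membrane potential
# decays toward rest, integrates injected current, and "spikes"
# (resets) when it crosses threshold. All parameters are illustrative.

def simulate_lif(current, steps=1000, dt=0.1,
                 v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0,
                 tau=10.0, resistance=10.0):
    """Euler integration of dV/dt = (-(V - V_rest) + I*R) / tau.
    Returns the number of spikes fired over the simulation."""
    v = v_rest
    spikes = 0
    for _ in range(steps):
        v += dt * (-(v - v_rest) + current * resistance) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

# Weak current never reaches threshold; stronger current fires repeatedly.
print(simulate_lif(1.0), simulate_lif(3.0))
```

Turn up the injected current and the spike count climbs; that input–output relationship is exactly the sort of thing this software tries to teach.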

So why do so many pieces of software created to teach neuroscience suck so much?

Now let me say this: the teaching value of the software is often excellent. The problem is that the implementation is rough, twitchy, and out of date. So maybe my question is better phrased as: why does neuroscience teaching software suck in the context of using it today, in 2016?

Let me give a few examples. This year and last I’ve used Swimmy (Grisham et al. 2008). Students have to crack the neural circuit that makes a fish swim. It is an awesome exercise that challenges my students intellectually.

But when it starts...


You’re presented an MS-DOS command line. 1990s memory whiplash right there. The interface consists of lots and lots of windows you have to resize manually. And the Mac version is so out of date that it doesn’t run properly any more.

I tried some software at The Mind Project, including Virtual EEG (Miller et al. 2008). Virtual EEG has a cool and interesting premise. You can create different sets of pictures (say, photos of objects and photos of people), and the program shows averages of real EEG data generated by people viewing those pictures. But, again, the interface is clunky and twitchy. It’s written in Java, and it still runs, but I ran into refresh issues: screens often didn’t redraw and display properly. It all worked, but it was such a chore to get to the stuff I wanted.

Realizing that these efforts were done the better part of a decade ago, I went looking for mobile apps.

I only found one promising candidate: Neuronify (pictured above). At first glance, it looked excellent. It runs on Android and iOS, and the user interface is very clean. But it feels more like a neurophysiological sandbox for playing around than a teaching tool. You build things rather than solve puzzles. The commands are very limited: you can inject current into a cell, but you can’t specify how much, for instance. I’m sure I could put it to good use, but I need to think about how to use it effectively.

The contrast between teaching software and textbooks is profound. Textbook publishers have massive teams keeping the content and presentation up to date. There are new editions every few years. Say what you will about the cost, nobody would deny the typical university textbook is a professional-looking, polished document.

Compared to the effort that goes into textbooks, most teaching software feels like the equivalent of a bunch of photocopied pages, printed off an old dot matrix printer and stapled together. It’s made by a small team, once, for some fairly specific teaching purpose, and nobody invests any effort in keeping it up to date after it’s out and a small paper in an educational journal is published. So even those of us who decide to use the software have to pay the pixel tax.

I want students to struggle. But I want students to struggle with the inherent complexities of cellular neuroscience. Students don’t type in command lines to run Pokémon Go or Snapchat, and I don’t want them struggling with command lines in class. It’s the least important thing of all.

References

Grisham W, Schottler NA, Krasne FB. 2008. SWIMMY: Free software for teaching neurophysiology of neuronal circuits. Journal of Undergraduate Neuroscience Education 7(1): A1-A. https://mdcune.psych.ucla.edu/modules/swimmy/swimmy-extras/grisham-etal-junef2008.pdf

Miller BR, Troyer M, Busey T. 2008. Virtual EEG: A software-based electroencephalogram designed for undergraduate neuroscience-related courses. Journal of Undergraduate Neuroscience Education 7(1): A19-A25. http://june.funfaculty.org/index.php/june/article/viewFile/629/628

External links

Swimmy
The Mind Project (including Virtual EEG)
Neuronify
A pixel artist renounces pixel art