Blog Post 2: Do Something!

March 17, 2026

I’m not big on TV shows, but one of my all-time favorites is House MD (if you’re not familiar with the premise, spend a couple minutes reading the Wikipedia synopsis; don’t worry, this blog post will still be here when you get back). In a fourth season episode, House and his team are discussing a diagnosis of a patient with a rare—and in this particular case, fictitious—condition when one senior member of the team mentions that this particular diagnosis could be caused by such a wide variety of environmental exposures that it is nearly useless to consider such an etiology. To this, House responds with the statement, “…Since it’s on the something side of nothing, I thought we’d go with it.” Upon re-watching this episode, I realized that this sentiment is foundational to my statistical practice and to my mentorship approach.

Growing up, throughout my K-12 schooling, there was almost always not only a distinctly “correct” answer but also, nearly as often, a “right” method for arriving at it. This could be due to a variety of reasons (standardized testing, perhaps?), but it was a consistent thread nonetheless. When I began as a graduate student, I expected this attitude to dissipate, but it didn’t: many of my classmates asked myriad surgically precise questions about how to do every single assignment and project. Once I began teaching an undergraduate educational psychology course for pre-service secondary teachers (grades 6-12, all content areas), I noticed the same trend: my students asked those same surgically precise questions about every aspect of a project or test. You, the reader, may think this makes sense, and in some way, of course it does: nobody wants to do something wrong, especially if the mistake could have been avoided by simply asking how to do it properly! However, I have to diverge from this opinion, at least to a certain extent. Of course, what I’m going to say does not apply to all people in all situations; it’s just a general rule by which I operate. Caveat emptor.

Admittedly, I was initially annoyed by my classmates’ and students’ seemingly basic, direct questions. It was almost as if any sense of agency and creativity had been completely eliminated and replaced by an intractable need to attain the prescribed “correct” answer. So, I started responding to some of my students’ questions with the phrase, “I don’t know. Try something.” Oddly enough, in removing the guardrails, I had also removed the proverbial “box” within which my students found themselves needing to think. The results were mixed, as would be expected. Some of my students adapted well and created lesson plan unit projects that went well beyond the scope of what they were expected to furnish (yes, I had a rubric, a directive, and a conversation; this wasn’t a pure sandbox), and it was evident they were proud of what they had crafted. Other students fell apart and resorted to lesson plans that read more like line-item checklists; not poor quality, just not very creative, personal, or adaptable. Nonetheless, my students’ projects were honest, and they kept improving each semester I taught, with only minimal additional direction about the “correct” thing to do.

This approach has persisted with the junior statisticians I mentor in the lab. When a junior statistician starts, they get a decent amount of training and time shadowing the more senior folks before being given the opportunity to lead. In one such project, one of my mentees and I were working on a power analysis for a two-way ANOVA. So, I outlined what our known or hypothesized parameters were, what our unknowns were, and what software I used, and had my mentee try running the power analysis herself. She admitted to not having done one before “in real life,” to which I responded, “Well, try something. It’s easier to refine when you have something to work with.” So, she tried it and had some success. We refined what she wrote to share with our collaborator, and I explained why I phrased the explanation the way I did, why we kept some parts of the narrative the same, and why we changed others. The next time around with the same mentee, we intended to run a power analysis for a mixed ANOVA, so with less explanation, and only one “real” power analysis in her repertoire, I told my mentee to do something. And she did!
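
To make this concrete, here is a minimal sketch of what “trying something” might look like: a simulation-based power analysis for a hypothetical 2x3 between-subjects design, written in Python with statsmodels (not necessarily the software from the project above). All of the design values below (cell means, common standard deviation, per-cell sample size) are made-up placeholders; the point is the scaffolding, not the numbers.

```python
# Simulation-based power analysis for a hypothetical 2x3 between-subjects ANOVA.
# Every design value here is a placeholder assumption; swap in your own
# known or hypothesized parameters.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)

# Hypothesized cell means: factor A (2 levels) x factor B (3 levels).
cell_means = np.array([[10.0, 12.0, 11.0],
                       [10.0, 14.0, 13.0]])
sd = 4.0          # hypothesized common within-cell standard deviation
n_per_cell = 20   # candidate sample size per cell
alpha = 0.05
n_sims = 1000     # increase for a more precise power estimate

effects = ["C(a)", "C(b)", "C(a):C(b)"]
rejections = {e: 0 for e in effects}

for _ in range(n_sims):
    # Generate one fake data set under the hypothesized means and SD.
    rows = []
    for i in range(cell_means.shape[0]):
        for j in range(cell_means.shape[1]):
            y = rng.normal(cell_means[i, j], sd, size=n_per_cell)
            rows.extend({"a": i, "b": j, "y": v} for v in y)
    df = pd.DataFrame(rows)

    # Fit the two-way ANOVA and record which effects reach significance.
    fit = smf.ols("y ~ C(a) * C(b)", data=df).fit()
    table = anova_lm(fit, typ=2)  # Type II sums of squares
    for e in effects:
        if table.loc[e, "PR(>F)"] < alpha:
            rejections[e] += 1

# Power estimate = proportion of simulated data sets in which the
# effect was detected at the chosen alpha.
for e in effects:
    print(f"Estimated power for {e}: {rejections[e] / n_sims:.3f}")
```

One nice property of the simulation route is that refining it later (unequal cell sizes, a mixed design with a within-subject factor, non-normal errors) mostly means editing the data-generating step, which fits the “start with something and refine it” workflow described above.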

As I’ve already stated, this is not the correct approach for all people in all situations, but it has worked very well for me and my mentees and students in a wide variety of contexts. The point of all this is that when we focus on getting the “correct” answer, we lose sight of the larger picture, the larger purpose of whatever we’re working on, and become mired in attempting to make everything look like it does in a textbook or on R-Bloggers. These are great resources, but the fact remains that real life is messy. Real research is messy. Real data can be downright ugly. And we, as statisticians, need to be able to adapt. Many of us find ourselves stuck in a perpetual state of “analysis paralysis,” forever attempting to find the “correct” solution (which might not exist) to a problem that doesn’t look like anything we’ve seen before. Doing something gives you a starting point. Even if you’re wrong, it gives you information you didn’t have before. It gives you something upon which to build, something to refine. If you do nothing, you end up with… nothing.

As a mentee, I hope you are in a place where you can try things that might not be right so you can work your way toward the solution that is. As a mentor, I hope you encourage your mentees to think outside the box. Don’t worry: if they’re really that far off, you (and likely they) will know it. Maybe a linear regression isn’t exactly right, but if it sets you on the path toward the regression tree that ends up being the best solution for your problem and your collaborators, then I’d say it was a good learning experience. So… do something!