
This collection of motion infographics from Bloomberg is pretty amazing. Each takes a single, complex issue and explains it using a brief, animated infographic. Beyond simply being a visual expression of data, each video tells a story, leaving the viewer with a full understanding of the issue at hand. Granted, not everyone has the expertise (or budget) to employ motion infographics, but there are little lessons to be learned in each. Enjoy.

For roughly the past five years, the U.S. Census Bureau Center for Economic Studies has supported an online system for pulling area-based employment and residence data using a visual, map-based selection tool called OnTheMap. This software is fairly intuitive and fun to use, but it can also be quite useful in exploring a specific market or region to understand where workers live and work, and how that has changed over time.

OnTheMap is useful for more than work location, however. It’s a multi-layered mapping tool, with companion data on demographics, earnings, and industry characteristics. We’ve also used it to identify exact metropolitan statistical areas and radius ranges, to find transportation routes, greenspace, and tribal and military lands, and simply to better understand a physical marketplace.

For years, organizations like the Census Bureau relied heavily on point-in-time estimates, tables of statistics, and static printed maps for this kind of data exploration. As new systems come online and improve over successive versions, our access to information from our desktops becomes both easier and more powerful.

Okay, this one’s a little abstruse… 🙂  Check the article too, whew!

Wired: World’s Most Precise Clocks Could Reveal Universe is Hologram

We try to pass along great infographics, but we also get excited about fresh and interesting approaches to cartography. And from time to time, in conducting national-scale quantitative research studies, we have to dig deep into Census statistics from 2000.

Eric Fischer’s recently posted Flickr photoset relies on 2000 Census stats and OpenStreetMap data, depicting racial and ethnic divides in a few dozen major American cities. His work was inspired by Bill Rankin’s map of Chicago’s racial and ethnic divides. The visualizations for Chicago, Atlanta, San Francisco, and Houston are particularly striking, and most maps feature explanatory notes via mouse-overs.

"Race and Ethnicity - Long Beach" by Eric Fischer

This is great work, and I hope one day to see an update based on 2010 Census data to see, among other things, the changes in the concentration of Hispanic-Americans, the differences in post-Katrina New Orleans, and the evolving makeup of Southeastern cities. This is fascinating data and imagery that really makes one think about what a difference a decade makes.

“David McCandless turns complex data sets (like worldwide military spending, media buzz, Facebook status updates) into beautiful, simple diagrams that tease out unseen patterns and connections. Good design, he suggests, is the best way to navigate information glut — and it may just change the way we see the world.”


I’ve recently come across several organizations and websites that aggregate and track facts. The Long Now is a foundation that claims as its goal the fostering of long-term thinking (see its blog), and companies like Ambient Devices offer cool consumer electronic products designed to “datacast,” constantly streaming real-time facts that by their nature are always changing, like the weather, the stock market, oil prices, and traffic congestion. (They go well beyond kitchen-window digital thermometers; their “Orb” ambient display is one such product.)

But Samuel Arbesman, a research fellow at Harvard Medical School who is affiliated with the Institute for Quantitative Social Science at Harvard University, started a fascinating website and blog earlier this year that focus on “mesofacts”: facts that change slowly over time, but which are challenging to track. I’ve been checking the blog periodically to see the various charts and subjects they post.

“These slow-changing facts are what I term “mesofacts.” Mesofacts are the facts that change neither too quickly nor too slowly, that lie in this difficult-to-comprehend middle, or meso-, scale. Often, we learn these in school when young and hold onto them, even after they change. For example, if, as a baby boomer, you learned high school chemistry in 1970, and then, as we all are apt to do, did not take care to brush up on your chemistry periodically, you would not realize that there are 12 new elements in the Periodic Table. Over a tenth of the elements have been discovered since you graduated high school! While this might not affect your daily life, it is astonishing and a bit humbling.”

Excerpt from a Boston.com article by Arbesman

I’ve always felt a little challenged by retention of facts. So much of my personal approach to learning has focused on comprehension, understanding, and pattern recognition that the details sometimes seem to go, pardon the clichés, “in one ear and out the other,” or get “stuffed into the back of my mind somewhere.” I can’t remember jokes to save a party, and I’m not even as good at music trivia as my friends expect me to be. I studied International Relations as an undergrad, but learned about the UN of the ’90s and the political climate of the post-Cold War world; it’s been challenging keeping up with foreign affairs and the state of international communications over the past ten years.

You don’t have to be a trivia buff, a librarian, or a passionate scholar to appreciate tracking mesofacts of some kind. We all have our own interests, and our own challenges in keeping up with the evolution of knowledge on those topics. Your focus may be academic, historical, entertainment-driven, or even outright silly, but do remember to keep thinking and push yourself to keep up!

Note: We’re always seeking comments for our blog posts, but few people actually submit them!  Feel free to tell us about your fact-watching, and especially your sources for keeping up-to-date, in the thread below!

Steve and I have been exploring the online reference site, The Book of Odds. Some of the site’s key functionalities are still in beta, but for over three years its team has been compiling odds to create a large database of “the odds of everyday life.” You can sign up for free and provide a little profiling information to begin exploring statements of probability related to your profile, or to anything you want to look up.

The idea is to explore the odds of something happening, and then to calibrate the probability in a comparison. If the topic you explore is included in the database (the four main current topic portals are Health & Illness, Accidents & Death, Relationships & Society, and Daily Life & Activities), you’ll get confirmed probability data on that topic, but you’ll also get leads on unexpected connections, as you compare unrelated events by their likelihood of occurring.

The site also has social and learning functions, and content beyond the odds database (newsletters, blogs, related links, etc.). We’re just getting started exploring this resource and brainstorming about how we can apply it to our day-to-day reference needs. It’s actually pretty challenging to think about life in terms of probability statements; even thinking up queries to get started takes effort. But once you dig into the site, there’s quite a bit to learn: not only the small bites of data, but how to calibrate probability, and new approaches to classifying and comparing phenomena.
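To make the idea of calibrating probability in a comparison concrete, here’s a tiny sketch in Python. The events and “1 in N” odds below are invented for illustration, not taken from the Book of Odds database:

```python
# Made-up "1 in N" odds for three hypothetical events -- not actual
# Book of Odds figures, just an illustration of the core idea:
# putting unrelated events on a single probability scale.
odds = {
    "hypothetical event A": 88,
    "hypothetical event B": 563,
    "hypothetical event C": 12_000,
}

# Sort from most to least likely and show each as a probability,
# which makes otherwise unrelated events directly comparable.
for event, n in sorted(odds.items(), key=lambda kv: kv[1]):
    print(f"{event}: 1 in {n:,} (p = {1 / n:.5f})")
```

Once everything is expressed on the same probability scale, surprising juxtapositions fall out naturally, which is exactly the kind of unexpected connection the site surfaces.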

At what price would you consider this product to be cheap?

At what price would you perceive this product to be expensive?

At what price would you consider this product to be priced so cheaply that you would worry about its quality?

At what price would you consider this product to be too expensive to even consider buying it?

These four very direct and intuitive questions form the basis of the Van Westendorp pricing exercise – a quantitative research technique that can actually yield robust and compelling data reflecting consumer demand. We’ve been thinking about the wide variety of quantitative analytical techniques we use in our work, and thought we’d provide a quick overview of this one.

The Van Westendorp pricing exercise is a price sensitivity measurement devised by a Dutch psychologist, Peter van Westendorp. This technique uses four questions about a product or service (drafted more or less like those above) and requires the respondent to gauge the prices at which it becomes too cheap or too expensive in the context of its offerings and perceived benefits.

Cumulative frequency distributions are derived from the responses to these questions and plotted, yielding the range of pricing options for the product. As the final step in this process, purchase intent is measured at the highest and lowest prices in that range. The optimal price (i.e., the price that maximizes market share while generating the highest possible revenue) can then be computed, along with the range of acceptable pricing.

[Chart: example Van Westendorp price sensitivity curves]

The data points on the example chart are plotted a little loosely, but the point at which the Too Cheap and Too Expensive curves intersect is considered the Optimal Price Point (OPP). The intersection of Expensive and Too Cheap yields the Point of Marginal Cheapness (PMC): at this price point, the number of people considering the product too cheap equals the number considering it expensive.

The intersection of Cheap and Too Expensive yields the Point of Marginal Expensiveness (PME): at this price point, the number of people who regard the product as too expensive equals the number who regard it as cheap. The range from PMC to PME is the Range of Acceptable Prices (RAP), or the Optimal Price Band.
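For the analytically inclined, here’s a minimal sketch of the mechanics in Python. The response data are hypothetical, and the crossing-point search is a simple grid approximation, not a production implementation:

```python
import numpy as np

# Hypothetical answers (in dollars) to the four questions from six
# respondents -- purely illustrative; real studies use large samples.
too_cheap     = np.array([ 5,  8, 10, 12, 15, 20])
cheap         = np.array([12, 15, 18, 20, 22, 25])
expensive     = np.array([15, 18, 22, 25, 28, 30])
too_expensive = np.array([18, 22, 28, 32, 38, 45])

prices = np.linspace(0, 50, 501)  # candidate price grid ($0.10 steps)

# Cumulative curves: at each candidate price, the share of respondents
# who would call the product (too) cheap falls as price rises, while
# the share calling it (too) expensive rises.
pct_too_cheap     = np.array([(too_cheap     >= p).mean() for p in prices])
pct_cheap         = np.array([(cheap         >= p).mean() for p in prices])
pct_expensive     = np.array([(expensive     <= p).mean() for p in prices])
pct_too_expensive = np.array([(too_expensive <= p).mean() for p in prices])

def crossing(curve_a, curve_b):
    """Price at which two cumulative curves come closest to intersecting."""
    return prices[np.argmin(np.abs(curve_a - curve_b))]

opp = crossing(pct_too_cheap, pct_too_expensive)  # Optimal Price Point
pmc = crossing(pct_expensive, pct_too_cheap)      # Point of Marginal Cheapness
pme = crossing(pct_cheap, pct_too_expensive)      # Point of Marginal Expensiveness

print(f"OPP: ${opp:.2f}; acceptable price band: ${pmc:.2f} to ${pme:.2f}")
```

With a real sample the four curves cross smoothly, and interpolating the exact intersection prices (rather than snapping to a grid) is straightforward.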

We also conduct pricing studies using conjoint and discrete choice designs, but the Van Westendorp pricing method is the most efficient way to evaluate price sensitivity itself: the resulting data are easy to interpret, identify an entire range of acceptable price points, and provide a solid basis for assessing future pricing strategies, ensuring that the optimal price-value balance is established. Contact us if you’d like to learn more about this research technique.