Building a Quasi-Random Password Generator in Mathematica

Here’s a simple, down-and-dirty password generator in Mathematica. I wrote it in version 9, but it should be backward compatible to . . . I don’t know, way back. There’s nothing really fancy or artful here; it’s just something that I find useful.

You’ll note that the function takes two inputs: the first is a flag controlling the character set (0 includes the punctuation characters, 1 restricts the password to alphanumerics only), and the second is the number of characters you want your password to contain.

cut[charlist_] := RotateLeft[charlist, RandomInteger[{10, 58}]];

shuffle[charlist_] := Module[{half = Ceiling[Length[charlist]/2]},
  (* riffle the two halves together; Riffle keeps every character,
     even when the list length is odd *)
  Riffle[Take[charlist, half], Drop[charlist, half]]];

pwdFn[anonly_: 0, len_] :=
 Module[{chars, charsX, nums, set, numcuts, inter, ret, fin},
  charsX = {"a", "b", "c", "d", "e", "f", "g", "h", "i", "j", "k", "l", "m",
    "n", "o", "p", "q", "r", "s", "t", "u", "v", "w", "x", "y", "z",
    "A", "B", "C", "D", "E", "F", "G", "H", "I", "J", "K", "L", "M",
    "N", "O", "P", "Q", "R", "S", "T", "U", "V", "W", "X", "Y", "Z",
    "!", "@", "#", "$", "%", "^", "&", "*", "-", "_", "+"};
  chars = Drop[charsX, -11]; (* letters only *)
  nums = {"1", "2", "3", "4", "5", "6", "7", "8", "9", "0"};
  set = If[anonly == 1, Join[chars, nums], Join[charsX, nums]];
  numcuts = RandomInteger[{233, 6765}];
  inter = Nest[shuffle[cut[#]] &, set, numcuts];
  ret = RandomChoice[inter, len];
  fin = StringJoin @ ret;
  Return[Style[fin, FontFamily -> "Courier", FontSize -> 14]]];

pwdFn[1, 16]

gYf2RcR96Z7NYgKZ

pwdFn[0, 16]

02ktc%bl*U6v@+GD
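For comparison, here is a minimal Python sketch of the same interface (the function and its name are my own, not part of the original post) using the standard library's `secrets` module, which draws from the operating system's CSPRNG and therefore needs no cut-and-shuffle step at all:

```python
import secrets
import string

def pwd(anonly=0, length=16):
    # mirror pwdFn's interface: anonly == 1 restricts to alphanumerics
    symbols = "!@#$%^&*-_+"
    pool = string.ascii_letters + string.digits
    if anonly == 0:
        pool += symbols
    # secrets.choice draws from the OS CSPRNG, so no manual
    # cut-and-shuffle mixing is needed
    return "".join(secrets.choice(pool) for _ in range(length))
```

Unlike a general-purpose pseudo-random generator, `secrets` is designed for exactly this use case, so each character is drawn independently and uniformly from the pool.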

Posted in Mathematica

Excel Random Numbers from Various Distributions

As a sort of companion to the most recent post dealing with alpha-Stable random values in Excel, here’s a VBA module that will generate pseudo-random numbers from any of the following distributions:

  • Standard Normal(0, 1) distribution
  • Normal(μ, σ) distribution
  • Log-normal(λ, ρ) distribution
  • Chi-square(n) distribution
  • Student’s T(n) distribution
  • Weibull(η, σ) distribution
  • Pareto(α, κ) distribution
  • Exponential(α) distribution
  • Rayleigh(α) distribution
  • Cauchy(μ, γ) distribution

Note that you’ll need to choose a method of generating Uniform random variates. Be aware, though, that Excel’s built-in pseudo-random number generator, RAND(), is notoriously weak. I prefer to use the Mersenne Twister VBA module available here, and the random number module linked above incorporates the Mersenne Twister module’s nomenclature.
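Several of the distributions listed above have closed-form inverse CDFs, so once you trust your Uniform source, inverse-transform sampling is all that's needed for them. Here is a minimal Python sketch of that step; the parameterizations (rate vs. scale, etc.) are my assumptions, since conventions for these distributions vary:

```python
import math

def exponential_inv(alpha, u):
    # Exponential(alpha), alpha = rate (assumed): F^-1(u) = -ln(1-u)/alpha
    return -math.log(1.0 - u) / alpha

def rayleigh_inv(alpha, u):
    # Rayleigh(alpha), alpha = scale (assumed): F^-1(u) = alpha*sqrt(-2*ln(1-u))
    return alpha * math.sqrt(-2.0 * math.log(1.0 - u))

def cauchy_inv(mu, gamma, u):
    # Cauchy(mu, gamma): F^-1(u) = mu + gamma*tan(pi*(u - 1/2))
    return mu + gamma * math.tan(math.pi * (u - 0.5))

def pareto_inv(alpha, kappa, u):
    # Pareto(alpha, kappa), kappa = minimum (assumed): F^-1(u) = kappa*(1-u)^(-1/alpha)
    return kappa * (1.0 - u) ** (-1.0 / alpha)
```

Feed `u` from whatever uniform generator you prefer. The distributions without closed-form inverses (Normal, Student's T, chi-square, and so on) need other constructions, such as Box-Muller for the Normal.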

You’ll also need to apply your own random seeding algorithm. Below is a random seed function that utilizes your computer’s internal clock:

Public Function randseed()
' Generates a positive integer value
' from the Excel Date/Time reading

    Application.Volatile
    Application.ScreenUpdating = False
    With Application.WorksheetFunction
    
    Dim currtime As Date
    Dim stime As String
    Dim sval As Double
    Dim inter As Double
    currtime = Now
    stime = CStr(currtime)            ' reuse the captured timestamp
    sval = TimeValue(stime)           ' fractional day, 0 <= sval < 1
    inter = (sval * 10 ^ 11) / .RandBetween(1000, 10000)
    randseed = Round(inter, 0)
    End With
    Application.ScreenUpdating = True
    
End Function
Posted in Business, Excel, Finance & Investing

alpha-Stable Random Values in Excel

It was a good day all around when Microsoft brought back macros in Excel 2011 for Mac.

Here’s a VBA routine that will generate a 1-D array of pseudo-random numbers from the alpha-Stable distribution. Note that you’ll need to include this excellent Mersenne Twister module to generate the uniform random variates. Please further note that you’ll need to provide the parameter values for alpha, beta, gamma, and delta; this routine won’t fit data.
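Since the VBA module itself is linked rather than reproduced here, the sketch below shows the standard Chambers-Mallows-Stuck transform that such routines typically implement, written in Python for brevity. The S1-style parameterization and the function name are my assumptions; map them onto your own alpha, beta, gamma, and delta conventions before relying on it:

```python
import math
import random

def stable_rv(alpha, beta, gamma=1.0, delta=0.0, rng=random):
    # One draw from an alpha-Stable law via the Chambers-Mallows-Stuck
    # transform: V is a uniform angle, W a unit exponential.
    V = rng.uniform(-math.pi / 2.0, math.pi / 2.0)
    W = rng.expovariate(1.0)
    if abs(alpha - 1.0) > 1e-12:
        t = beta * math.tan(math.pi * alpha / 2.0)
        B = math.atan(t) / alpha
        S = (1.0 + t * t) ** (1.0 / (2.0 * alpha))
        X = (S * math.sin(alpha * (V + B)) / math.cos(V) ** (1.0 / alpha)
             * (math.cos(V - alpha * (V + B)) / W) ** ((1.0 - alpha) / alpha))
        return gamma * X + delta
    # the alpha == 1 (Cauchy-like) case needs its own branch
    X = (2.0 / math.pi) * ((math.pi / 2.0 + beta * V) * math.tan(V)
         - beta * math.log((math.pi / 2.0) * W * math.cos(V)
                           / (math.pi / 2.0 + beta * V)))
    return gamma * X + delta + beta * (2.0 / math.pi) * gamma * math.log(gamma)
```

As a sanity check, alpha = 2 recovers a Normal distribution (with variance 2 for gamma = 1 in this convention), and alpha = 1 with beta = 0 recovers the standard Cauchy.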

It’s a helpful tool if you’re noodling on Excel and just want to run some quick return scenarios on a particular asset.

Posted in Business, Excel, Finance & Investing

Portfolio Simulation in Mathematica (edited for version 9)

Last year I posted a demonstration of copula-based portfolio simulation in Mathematica version 8. Since then, Wolfram has released Mathematica version 9, which includes lots of new functionality that will be useful for investors and managers – especially in the area of statistics.

Also, as I noted in the earlier post, one problem with multivariate simulation in Mathematica is excessive computation time. Using explicit Mathematica functions, it can take an hour or more to produce a 10^4-row matrix of simulated observations for a portfolio with fewer than a dozen assets. And since ten thousand observations is a bare minimum for serious analysis, Mathematica’s computation time is a serious detriment. Another purpose of this article is to present a neat workaround that returns a same-sized matrix of simulated data in just a few seconds.

While I’m tempted to take all the unchanged content from my earlier post and just incorporate it by reference in this article, I think readers using version 9 and later will find it more helpful to have all the information in one place. The earlier article is still a working, albeit less content-rich, example of portfolio simulation in Mathematica version 8, and it can still be utilized by those who have not yet updated to version 9.

As before, we simulate portfolio daily returns using a Student’s T copula. This time, though, we’ll be using a different set of daily stock prices, in part because I want to illustrate the ease with which Mathematica handles negative correlations in the context of multivariate distributions.

First the disclaimer: the following simulation and the particular stock price data referenced herein are purely illustrative. Absolutely no warranty or recommendation on the part of the author or site is intended, including no recommendation or opinion whatsoever with respect to the specific equities chosen or their suitability as an investment. Further, the Mathematica code herein is offered as-is, and with no warranty whatsoever, including warranties of merchantability or fitness. Use the following code at your own risk, but under no circumstances should any reader use the following methodology and/or Mathematica code for actual investing.

Data Import and Notional Portfolio

Assume we have a 5-asset portfolio composed of four small-cap equities with a recent history of high performance (EGHT, IOSP, MG, and MGAM), and as a simple hedge, we’ll arbitrarily choose ProShares Dow UltraShort (DXD) as our fifth asset.

Although Mathematica will fetch closing prices and a few select fundamentals with respect to a given equity, for this example it’s easier to download the table of daily log-deltas and corresponding dates for the notional portfolio here, the characteristics of which can be seen below:

[Figure: empchart]

[Figure: cumretp]

Parenthetically, I should add that the fundamental data available through Mathematica’s built-in FinancialData function are in my opinion much too limited to use Mathematica as a stand-alone application for managing a portfolio. No doubt this is deliberate in light of Wolfram’s recently introduced Finance Platform. Since at present the Finance Platform is available only for Windows, I have no experience with it.

Portfolio Descriptive Statistics

First we construct a function that will return a list of useful descriptive statistics for a corresponding array of daily returns data:

Clear[descripStat];
descripStat[data_] := Module[{drift, volatility, autocorr, medi, fin},
  drift[x_List] := Mean[x]*252;
  volatility[x_List] := StandardDeviation[x]*Sqrt[252];
  autocorr[x_List] := Module[{x1 = Drop[x, -1], x2 = Drop[x, 1]},
    Correlation[x1, x2]];
  medi = {N[Length[data]], N[Variance[data]], N[StandardDeviation[data]],
    N[MeanDeviation[data]], N[MedianDeviation[data]], N[QuartileDeviation[data]],
    N[Mean[data]], N[HarmonicMean[data]], N[Median[data]], N[Skewness[data]],
    N[QuartileSkewness[data]], N[Kurtosis[data]], N[Quantile[data, 0.25]],
    N[Quantile[data, 0.75]], Max[data], Min[data], drift[data] // N,
    volatility[data] // N, autocorr[data] // N};
  fin = TableForm[medi,
    TableHeadings -> {{"Length", "Variance", "Standard Deviation",
      "Mean Deviation", "Median Deviation", "Quartile Deviation", "Mean",
      "Harmonic Mean", "Median", "Skewness", "Quartile Skewness", "Kurtosis",
      "First Quartile", "Third Quartile", "Maximum", "Minimum", "Drift",
      "Volatility", "Autocorrelation"}, {""}}];
  Return[fin]];
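Of the statistics above, only drift, volatility, and lag-1 autocorrelation are hand-rolled rather than built in. As a cross-check, here is the same arithmetic as a small Python sketch; the 252-trading-day annualization mirrors the Mathematica definitions:

```python
import math
import statistics

def drift(returns):
    # annualized mean daily log-return (252 trading days per year)
    return statistics.mean(returns) * 252

def volatility(returns):
    # annualized standard deviation of daily log-returns
    return statistics.stdev(returns) * math.sqrt(252)

def autocorr(returns):
    # lag-1 autocorrelation: correlate the series against itself shifted
    # by one day, matching Correlation[Drop[x, -1], Drop[x, 1]]
    x1, x2 = returns[:-1], returns[1:]
    m1, m2 = statistics.mean(x1), statistics.mean(x2)
    num = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    den = math.sqrt(sum((a - m1) ** 2 for a in x1) *
                    sum((b - m2) ** 2 for b in x2))
    return num / den
```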

Because closing stock prices occasionally repeat from one day to the next (producing log-deltas of exactly zero), and since a number of these statistical functions will throw a division-by-zero error, we need to change all of our zero-valued log-delta observations to a number that’s marginally different from zero:

Clear[rul]; rul = g_ :> RandomReal[{0.0000000001, 0.00000001}] /; g == 0;

Next we construct a table of descriptive statistics for each of the equities in the portfolio:

Clear[descripTbl];
descripTbl = TableForm[
  Transpose[{descripStat[lndxd /. rul][[1]], descripStat[lneght /. rul][[1]],
    descripStat[lniosp /. rul][[1]], descripStat[lnmg /. rul][[1]],
    descripStat[lnmgam /. rul][[1]]}],
  TableHeadings -> {{"Length", "Variance", "Standard Deviation",
    "Mean Deviation", "Median Deviation", "Quartile Deviation", "Mean",
    "Harmonic Mean", "Median", "Skewness", "Quartile Skewness", "Kurtosis",
    "First Quartile", "Third Quartile", "Maximum", "Minimum", "Drift",
    "Volatility", "Autocorrelation"},
   {"DXD", "EGHT", "IOSP", "MG", "MGAM"}}]
  

[Figure: descripTbl]

We need to generate a correlation matrix for the asset log-deltas, and from the standard correlation matrix, we will generate a conditioned VCV matrix using the following Mathematica code:

Clear[matrixV];
matrixV[corrmat_] := Module[{evcmat, evlmat, inter, ret},
  evcmat = Eigenvectors[corrmat]; (* eigenvectors as rows *)
  evlmat = DiagonalMatrix[Eigenvalues[corrmat]];
  (* symmetric square root V.Sqrt[L].Transpose[V], where the columns
     of V are the eigenvectors *)
  inter = Transpose[evcmat].MatrixPower[evlmat, 1/2].evcmat;
  ret = SetPrecision[inter, 8];
  Return[ret]];
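What matrixV returns is the symmetric (principal) square root of the correlation matrix. As a sanity check on that idea, the 2×2 case even has a closed form via the Cayley-Hamilton theorem; the Python sketch below, using an illustrative 0.6 correlation of my own choosing, verifies that the result multiplies back to the original matrix:

```python
import math

def sqrt2x2(m):
    # principal square root of a 2x2 symmetric positive-definite matrix:
    #   sqrt(M) = (M + sqrt(det M) * I) / sqrt(tr M + 2 * sqrt(det M)),
    # a consequence of the Cayley-Hamilton theorem
    (a, b), (c, d) = m
    s = math.sqrt(a * d - b * c)
    t = math.sqrt(a + d + 2.0 * s)
    return [[(a + s) / t, b / t], [c / t, (d + s) / t]]

def matmul2(x, y):
    # plain 2x2 matrix multiply, used only to check sqrt2x2
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

For larger matrices the eigendecomposition route in matrixV is the practical choice; the closed form is just a convenient way to see what "square root of a matrix" means here.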
  

[Figure: vcvtbl_032013]

For purposes of portfolio simulation, it’s the conditioned VCV matrix – not the standard correlation matrix – that will govern the simulated portfolio’s codependency.

Before we start working with the VCV matrix, we need to ensure that it’s symmetric and positive definite. Mathematica internally uses Cholesky decomposition in a number of functions, and an input matrix that is not positive definite will cause Mathematica to throw an error. We can test for these matrix qualities like so:

vcvmat===Transpose[vcvmat]
PositiveDefiniteMatrixQ[vcvmat]
  

The latter is a new function introduced in version 9.
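If you are curious what that test amounts to, one common implementation (not necessarily Mathematica's internals) is simply to attempt a Cholesky factorization, which succeeds exactly when a symmetric matrix is positive definite. A pure-Python sketch:

```python
import math

def is_positive_definite(m):
    # attempt a Cholesky factorization of symmetric matrix m;
    # a nonpositive pivot along the way means m is not positive definite
    n = len(m)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = m[i][i] - s
                if d <= 0.0:
                    return False
                L[i][i] = math.sqrt(d)
            else:
                L[i][j] = (m[i][j] - s) / L[j][j]
    return True
```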

The price data we’ve collected starts with the first trading day in 2011. Based on that, it’s useful to construct a vector of average returns for each asset:

Clear[muvec];muvec=Mean[lnall];
  

For purposes of simplicity, we’ll assume that each of our holdings in the respective assets makes up one-fifth of total portfolio value. Total portfolio return is just the dot-product of muvec and a vector of portfolio weights:

Clear[portret];portret=muvec.Table[1/5,{5}];
  

Then to obtain the average annual return, just multiply portret by 252 (the usual number of trading days in a year).

Asset Marginal Distributions

It’s important to recognize that copula-based simulation will return values in the form of CDF probabilities, 1 > p > 0, for the specified multivariate distribution – not simulated observations, per se. Clearly, a 10^4 by 5 matrix of simulated Student’s T probabilities isn’t terribly useful for our purpose. The real power of copula-based simulation comes from applying the appropriate marginal distribution to the simulated CDF values for each asset.

As famously observed by Benoit Mandelbrot, daily changes in equity and commodity prices are not Normally distributed. In fact, they’re not even log-normally distributed. Accordingly, I typically model daily log-deltas in accordance with either the alpha-Stable distribution or the TsallisQ Gaussian distribution, whichever provides the better fit. Both of these distribution forms readily accommodate data sets that are both longer-tailed and higher-peaked than either the Normal or log-normal distributions:


[Figure: compPDFs]
The graphic above illustrates the comparative distribution PDFs with respect to EGHT’s daily log-returns.

Mathematica accomplishes distribution fitting with the following function:

stable = EstimatedDistribution[array, StableDistribution[1, α, β, δ, γ]]
tsallisq = EstimatedDistribution[array, TsallisQGaussianDistribution[μ, β, q]]

and thereafter we can test the returned distributions to see which is the better fit like so:

DistributionFitTest[array,#]&/@{stable,tsallisq}
  

For our test data, Mathematica returned the following best fits:

[Figure: margDs]

Student’s T Degrees of Freedom Parameter

A Student’s T copula function takes as inputs the multivariate correlation (VCV) matrix and the estimated degrees of freedom parameter, v (from the Student’s T distribution). Using maximum likelihood, we can estimate the appropriate value for v by maximizing the following function with respect to our VCV matrix and the variable, v:

Clear[tLLFn];
tLLFn[v_, corrmat_] := Module[{dim, ret},
  dim = Length[corrmat];
  ret = -(Pi^(-dim/2)*v^(-1 - dim/2)*Gamma[(dim + v)/2]*
        (dim + v*PolyGamma[0, v/2] - v*PolyGamma[0, (dim + v)/2]))/
      (2*Sqrt[Det[corrmat]]*Gamma[v/2]);
  Return[ret]];

For the test data we’re using, Mathematica returns v = 9.

Ordinarily, we would take our VCV matrix and estimated value for v, and plug them into Mathematica’s built-in CopulaDistribution function. As I referenced above, however, the computation times for Monte Carlo simulation using this method are simply untenable. As a result, I’m particularly grateful to the member known as Sasha at mathematica.stackexchange.com for demonstrating how to use interpolation to derive the same simulation output in a fraction of the time.

To accomplish this, you can see from the linked page above that Sasha created a Fritsch-Carlson interpolation function, together with a grid system to hold the interpolated values. Obviously the grid Sasha designed can be adjusted for greater or lesser granularity, depending on user preference and purpose for the simulation. Thereafter, Sasha’s function applies a multivariate Student’s T distribution to the grid (which yields our simulated CDF values), and then it maps the respective marginal distribution to each asset’s CDF array.

The total elapsed time for Sasha’s operation is less than 5 seconds for a 10^4 × 5 simulation, as compared to 15 minutes or more using Mathematica’s explicit CopulaDistribution function.
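To make the copula mechanics concrete, the pipeline in all of these approaches is the same: draw correlated variates, convert them to CDF probabilities, then feed those probabilities to each asset's marginal quantile function. The Python sketch below substitutes a Gaussian copula and toy closed-form marginals of my own choosing, purely so everything fits in a few lines; the post's actual method uses a Student's T copula and the fitted stable/Tsallis marginals:

```python
import math
import random

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def copula_draw(chol, marg_ppfs, rng):
    # one simulated multivariate observation:
    # correlated normals -> CDF probabilities -> marginal quantiles
    z = [rng.gauss(0.0, 1.0) for _ in chol]
    corr = [sum(chol[i][k] * z[k] for k in range(i + 1))
            for i in range(len(chol))]
    u = [norm_cdf(v) for v in corr]            # probabilities in (0, 1)
    return [ppf(ui) for ppf, ui in zip(marg_ppfs, u)]

# toy marginals with closed-form inverse CDFs (illustrative only)
cauchy_ppf = lambda u: math.tan(math.pi * (u - 0.5))
expo_ppf = lambda u: -math.log(1.0 - u)
```

Here `chol` is the lower-triangular Cholesky factor of the correlation matrix, and `marg_ppfs` holds one quantile function per asset; swapping in multivariate-T draws and T CDF values turns this into the Student's T copula the post describes.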

Scrubbing the Simulated Data

One drawback of modeling with alpha-Stable and/or Tsallis Q Gaussian distributions is that, for typical fitted parameters, the second moment (the variance) is infinite. In practice, it means that Monte Carlo simulation of these distributions will occasionally return values that are so far out of scale as to be unusable (in statistical terms, an “outlier”). Recalling from our descriptive statistics above that the maximum change in daily log-delta for any of the five equities was on the order of 0.4, you can see from the chart below that we did in fact generate some extreme observations:

[Figure: rawsim]

If we were simulating log-deltas for a single asset, the scrubbing process would be simple; we’d just remove any entry whose absolute value is greater than some assigned threshold. Since we’re simulating multivariate data, however, each 1 × 5 observation (or each day’s simulated return) is interdependent. Consequently, we need a way to test for multivariate outliers.

It turns out that there are several ways to accomplish this, but the mathematically simplest is the Mahalanobis distance method. We can implement the method in Mathematica like so:

Clear[scrubCop];
scrubCop[rvraw_, ci_: 0.975] :=
 Module[{md, dims, x2, twov, twov2, ret},
  dims = Length[Transpose[rvraw]];
  md = mDist[rvraw]; (* mDist: user-supplied list of Mahalanobis distances, one per row *)
  x2 = Quantile[ChiSquareDistribution[dims], ci];
  twov = MapThread[Prepend, {rvraw, md}];
  twov2 = Complement[twov, Select[twov, #[[1]] > x2 &]];
  ret = Map[Drop[#, {1}] &, twov2];
  Return[ret]];
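For reference, here is the same filter as a compact Python sketch. Note that mDist in the Mathematica code above is a user-supplied helper; in this sketch the mean vector, the inverse covariance matrix, and the chi-square cutoff (the analogue of Quantile[ChiSquareDistribution[dims], ci]) are all passed in as precomputed inputs:

```python
def mahalanobis_sq(x, mean, cov_inv):
    # squared Mahalanobis distance of one observation from the mean,
    # given the inverse of the covariance matrix
    d = [xi - mi for xi, mi in zip(x, mean)]
    n = len(d)
    return sum(d[i] * cov_inv[i][j] * d[j] for i in range(n) for j in range(n))

def scrub(obs, mean, cov_inv, chi2_cutoff):
    # keep only observations whose squared distance falls within the
    # chi-square quantile cutoff, as in scrubCop above
    return [x for x in obs if mahalanobis_sq(x, mean, cov_inv) <= chi2_cutoff]
```

Because the distance is computed against the full covariance structure, an observation is dropped only if it is extreme relative to the joint distribution, not merely large in one coordinate.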

After scrubbing, the box and whiskers chart below shows our nicely distributed yet long-tailed simulations, free from multivariate outliers:

[Figure: scrubdat]

Posted in Finance & Investing, Mathematica

Mathematica as Guitar Scratch Pad

Below is a block of Mathematica code that will generate and play the last few bars of Pat Metheny’s beautiful Solo from “More Travels”. Just click the play button in the bottom left-hand corner of the graphic once the .CDF application loads. Note that you will need to enable JavaScript if it has been disabled. Please also note that some users have reported difficulty loading the application in Safari, but I’ve heard of no such difficulties with Firefox.

Posted in Mathematica, Music

Rationalizing Popular Outré or: Why Would Anyone Commission Frank Gehry?

It’s more important to be good than original.

—Mies van der Rohe


At the Intersection of Art and Architecture, Where Major Collisions Frequently Occur

Some topics actually beg to be addressed.

Those of us who live in free societies accept non-controversially that art is completely entitled to be provocative, even designedly and purposefully so.  Indeed, some legitimately contend that this is art’s primary function (I respectfully disagree, but I digress).

Architecture, at least public* architecture, must satisfy a different standard.

* I use the term “public” herein in a commercial sense to denote places of public accommodation, regardless of who owns them or how they were financed.  In legal terms, I mean structures that are or will be subject to the public accommodations provisions of the Americans With Disabilities Act.

To be sure, much of architecture can be appreciated in the same way as sculpture or other three-dimensional artistic media.  But the aegis is critically different; architecture must always facilitate.

That means aesthetics, no matter how inspired, are but a single component of architectural merit.  More to the point, it means aesthetics must never diminish the utility of public architecture.

At the very core of architecture lie constraints: fiscal, commercial, physical, legal, natural.  And since unrecognized or unheeded constraints are satisfied only accidentally, the first step in architectural creation is and must be analysis – not inspiration.

These dual notions of utility and constraints necessarily inform architectural criticism.  Good architecture satisfies constraints and reconciles aesthetics with utility.  Great architecture leverages constraints and integrates them with aesthetics to provide excess utility.

One important measure of architectural utility is sociological.  Great public architecture unifies and elevates a community, even as great art – no less importantly – often factionalizes.  I like the above quotation from Mies, in part because it encapsulates perfectly the architect’s practical imperative.  Architects are rightfully judged by their ability to envision and then deliver utility, and this is a chief distinction between architects and artists.

Which brings me to Frank Gehry, and specifically the Cleveland Clinic’s Lou Ruvo Center in Las Vegas.


[Figure: nuvo_center]

Monumental irony.  The Cleveland Clinic Lou Ruvo Center for Brain Health, designed by Frank Gehry – a building dedicated to the study and eradication of Alzheimer’s disease and other degenerative brain disorders.


Birds Gotta Fly, Fish Gotta Swim

Let me emphasize that I have no quarrel with Frank Gehry, per se.  As an architect, he makes no classical pretensions; he does what he does, and his design propensities are well known.  On the subject of Gehry’s design, I’d merely point out that Albert Hofmann never advocated a daily regimen of his famous discovery; he only claimed that it opened up some interesting possibilities.

In the same way, I’m libertarian enough not only to wish Frank Gehry and his associates well with respect to residential and private commissions, but to absolve him from guilt for accepting design commissions for public buildings.  After all, he has a business to run.

My real beef is with whoever handed the hot mic to Yoko – again, and I’m particularly interested in what would motivate someone to do it.


This Just In: Men Go Crazy in Congregations

By googling the terms “emperor’s new clothes” and “phenomenon,” I discovered the following paper written by three colleagues at Cornell and published in the American Journal of Sociology.  It concerns the sometimes inexplicable popularity of the unpopular; for example, how might one explain it if, over time, music critics began to praise recordings that featured cacophonous rhythms with no melody?

According to the paper by Centola, Willer and Macy, this phenomenon cannot occur within a generalized and fully connected social network.  Popular unpopularity only takes root when agents within a developed local network create their own local preference and feedback loop for an unpopular norm, and thereafter export it to the generalized, connected social network.  To the extent that the generalized network attributes some special expertise to the local network, this of course greatly increases the probability of generalized acceptance.

If Centola et al. are correct, it means the AIA and the Pritzker Prize committee are as responsible as anyone for the Lou Ruvo Center and all of Gehry’s other aesthetic blasphemies.

It also means that the way to prevent future mis-coronation is for architects to connect regularly with people outside of the architecture community.  Actually show an architectural portfolio to people unaffiliated with architecture and ask them if it’s any good.  Listen to what they say and weigh their opinions; don’t automatically dismiss them as Philistines if their opinions run counter to the architecture community’s prevailing assessment.  To charitable benefactors and governmental decision makers: trust your eyes.  Above all, don’t let an architect or architecture critic convince you that naked is clothed.


Posted in Architecture, Behavioral Economics, Pet Peeves

The 5 Best Head Coaching Jobs in College Football

To mark the occasion of a rare but seemingly imminent vacancy, we list the top 5 head coaching positions in all of college football.

Andy Staples of Sports Illustrated compiled a somewhat similar list in an article last year, but that list didn’t take into account many of the all-important intangibles that sometimes mean the difference between satisfaction in a job, and surfing monster.com every night.

Once intangibles come into play, five jobs stand above the rest.

Winning tradition obviously matters, but no weight is given to current W-L record or even a school’s winning-percentage trend.  The top 5 head coaching jobs would still be the top 5 even if some or all of the schools were winless right now.  That may seem counterintuitive, but it doesn’t necessarily follow that each of the five incumbents is doing a good job of maximizing his opportunity.

Also, since no school is immune, I give no weight to the effect of NCAA sanctions, whether threatened or imposed.  The sole exception to this is Penn State, due to the uncertain long-term effects of truly draconian sanctions, combined with post-Sandusky fallout.  That’s a significant exception because Penn State would otherwise be a borderline top 5 job given the school’s historical recruiting dominance not only in Pennsylvania, but all along the Eastern seaboard from Maryland to Maine – something no other B1G school can boast.

On that same note, in assessing desirability we start from the premise that unless you’re Bill Snyder, better players make better coaches.  That means the most desirable head coaching jobs all enjoy inherent and sometimes unique advantages with respect to recruiting.

We also start from the idea that, all things being equal, the probability that a given high school player will sign with a particular school is most importantly a function of proximity.  Of course things are seldom equal, so the most desirable head coaching jobs are at schools that have a time-tested approach to overcoming resistance to distance.  Taken together, it means the most desirable head coaching jobs are in geographic areas replete with indigenous high school talent, but more narrowly, at schools that have also managed to create an inherent appeal to high school kids.

It turns out that the inherent appeal of a school can be predicted in accordance with certain characteristics.  In that vein, before we identify the top 5 head coaching gigs, it’s helpful to note what factors not only tie these jobs together, but also separate them from the rest of the pack:

  •     A solid base of wealthy, powerful, invested alumni
  •     A national brand, including iconic uniforms that change rarely, if ever
  •     Longstanding winning tradition
  •     Excellent athletic facilities
  •     Top-tier pay scales for the head coach and assistants
  •     A track record of putting players in the NFL (Mel Kiper, if awakened from REM sleep, could instantly recite the 3-deep roster for any of these 5 schools, and NFL scouts can find their way around all 5 campuses without a map)
  •     A fully rationalized and consistent recruiting pitch
  •     A fully developed self-identity (these schools know who they are and who they aren’t)
Not surprisingly, there aren’t many cons that go with the top 5 jobs, but one shared drawback is very high fanbase expectations.  Bowl eligibility for these teams is simply assumed.  Indeed, the more rational and moderate fans of these five schools want to see their team in a BCS bowl more years than not, and in the National Championship game at least once or twice a decade.

One final point before we get to the rankings.

Academics is a two-edged sword, regardless of where a school sits on the academic spectrum.  Universities with stronger-than-average academic reputations have a better chance of recruiting and successfully matriculating offensive linemen and the occasional skill position player with a 1400 SAT.  Yet those same universities may be somewhat disadvantaged with respect to kids who view college as nothing but an interim stop and an unavoidable obstacle to reaching the NFL (regardless of whether the player has a realistic chance of someday making an NFL roster).  Naturally, for universities with average to weaker-than-average academic reputations, the converse may be true.


1.    Southern Cal

Pros

Hollywood

Let’s face it.  An Atlanta kid can be holding scholarship offers from every SEC school and Florida State, but if he gets an offer from Southern Cal his eyes will light up.

Song Girls

Never mind that all of them are dating Newport Beach investment bankers, on TV they manage to look somehow attainable.

National Recruiting Appeal

Southern Cal is the only school whose recruiting pitch is as strong on the other side of the country as it is in their own backyard.

Private Institution

This is a seldom-mentioned but tremendously important advantage.  Any disclosure policies applicable to Southern Cal football are likely self-imposed.  Among lots of other things, that means unpleasant or potentially embarrassing discussions can take place behind firmly closed doors.  It also means Southern Cal is not beholden to the California legislature for funding, nor subject to the caprice of a UCLA-alumnus governor.

First Pick of California High School Players

Were it any other school, this would be listed at the top.

Weather

Arguably the best weather on the planet.

Cons

South Central Los Angeles

To clear up any misunderstanding, the USC campus isn’t actually in Hollywood.

Smaller, Less Engaged Fanbase

Except for the Lakers, Los Angeles isn’t much of a spectator sports town.

L.A. Coliseum

Southern Cal is the only school in the top 5 without an on-campus football stadium.

Pacific Time Zone

It makes no discernible difference with respect to recruiting, but the East Coast will have gone to bed before Southern Cal’s night games are over.

2.    LSU

Pros

First Pick of Louisiana High School Players

With very few exceptions.  The significance of this is difficult to overstate.  Louisiana high schools play an outstanding brand of football; certainly on par with Texas and Florida, and maybe even better when you adjust for the smaller population.  In addition, Louisiana kids who hold a scholarship offer from LSU face significant pressure from the community to stay in-state.

Strong Recruiting All Along the Gulf Coast

The LSU program is known and respected by high school players and coaches from East Texas to Florida.  LSU takes a back seat to no one in terms of recruiting prowess in the South.

Louisiana Has Only One AQ School

Unlike Texas, Mississippi, Alabama, Georgia, Florida, Tennessee, etc.

Proximity to New Orleans

Any game at the Superdome – currently the site of the Sugar Bowl and 1 in 4 National Championship games – is essentially a home game.

Cons

Baton Rouge

Not much pizzazz, but again it’s driving distance to New Orleans.

SEC West Schedule

Brutal, and will remain so for the foreseeable future.

Weather

Very hot and very muggy.

3. (Tie) Texas

Pros

Wealth

In terms of endowment, Texas is the wealthiest university in the United States not named Harvard.

Longhorn Network

Mack Brown’s recent comments notwithstanding, the LHN represents a tremendous advantage.  It alone caused Texas A&M to jump, indignant, from the frying pan straight into the fire.

Dominant Institution in the Big 12

It’s hard to get anything done inside the Big 12 unless Texas goes along.

First Pick of Texas High School Players

With exceptions.  Some Texas kids grow up dreaming of playing for Oklahoma or Texas A&M.  A lesser number grow up dreaming of playing for Baylor.  And that’s about it.

Unparalleled Facilities

To characterize Texas’ facilities as “excellent” is an understatement.  For one thing, the Longhorns’ locker room at Darrell K. Royal Memorial Stadium is orders of magnitude nicer than the Dallas Cowboys’ locker room at Jerry World.

Austin

In fairness, the manifold attractions of Austin are mainly the kind of things that appeal to 30-something professionals.  It’s much less clear whether these are really difference-makers to high school kids.

Cons

Winning Isn’t Enough

Being the head coach at Texas is very much akin to holding state office, in that a mind-numbing litany of protocols and political obligations come with the job.  Further, the head coach’s ability to satisfy these is as determinative to his success as the team’s W-L record.

Self-Imposed Academic Restrictions

More than could be adequately covered in the general academics discussion above.  Without mentioning names, there have been any number of truly outstanding Texas high school players who Texas wouldn’t recruit due to the school’s own academic concerns.

Big 12 Uncertainty

Who knows what the future holds for the Big 12, but the conference certainly doesn’t feel stable.  Not like the B1G and SEC feel stable.

Goliath Perception Within Texas

Which, incidentally, doesn’t apply to Oklahoma or Texas A&M.  As Wilt Chamberlain famously lamented, “No one roots for Goliath.”  True that, regardless of how many University of Gath alumni there are in Dallas and Houston.

No Recruiting Footprint Outside of Texas

Other than marginal inroads into Arizona and Colorado, Texas has little experience with the unique challenges that accompany recruiting out-of-state.

Weather

Summer is ridiculously hot, to the point that kids visiting from Florida notice it.

3. (Tie) Oklahoma

Pros

Winning Is Enough

Listed first because this factor alone nullifies any advantage the Texas job might otherwise enjoy by virtue of the Longhorn Network or anything else.  When I say “winning is enough,” I mean that Bob Stoops (or whoever has the Oklahoma job) doesn’t always have to conduct himself like he’s running for re-election.  This is something that can only be fully appreciated after you’ve had to do things the other way.

First Pick of Oklahoma High School Players

As with Texas, there are always a few exceptions.  Oklahoma’s is not nearly as significant as LSU’s home-state recruiting advantage, but Oklahoma high schools turn out a strong group of football players every year.

Second Pick of Texas High School Players

Solidly second.  Oklahoma picks before Texas A&M or anyone else besides Texas.

Targeted Out-Of-State Recruiting

Besides the state of Texas, Oklahoma has made solid inroads into Las Vegas (including traditional powerhouse Bishop Gorman), and has also made inroads into California.

Cons

Dependence on Texas

The school and the state.  With respect to the state, Oklahoma simply can’t win consistently without Texas high school players.  With respect to the school, how dependent is it?  Put it this way: each of the other three schools (Oklahoma, Oklahoma State, and Texas Tech) that were invited to join the Pac-12 along with Texas would be clearly and demonstrably better off today if the proposed move had gone through.  Unfortunately for those schools, the Pac-12 wasn’t interested sans Texas.

Big 12 Uncertainty

See Texas.

Norman and Oklahoma City

The NBA Thunder notwithstanding, these towns have even less pizzazz than Baton Rouge.  They’re also more than a comfortable drive away from any large city.

Weather

See Texas.

5.  Notre Dame

Pros

Unmatched National Fan Base

Notre Dame’s strong contingent of alumni and subway alumni are everywhere.

Sui Generis

The exclusive deal with NBC is one example; Notre Dame’s BCS eligibility criteria are another.  Notre Dame is recognized as unique throughout the world of college football.  This of course greatly streamlines the process when Notre Dame inevitably requests special treatment.

Private Institution

See Southern Cal.

Chicago’s Team

With all due respect to the University of Illinois and Northwestern, Chicago’s college football team is Notre Dame.  Although Southern Cal comes closest, no other school dominates one of the ten largest U.S. cities in the same way.

Unmatched Tradition

Notre Dame has produced more Heisman Trophy winners, by a comfortable margin, than any other school.  The Four Horsemen.  Knute Rockne.  Frank Leahy.  Ara Parseghian.  If kids today don’t recognize those names, that’s on us.

Cons

National Recruiting Imperative

It’s highly doubtful whether Notre Dame could win consistently with only Midwestern kids.

Self-Imposed Academic Restrictions

See Texas, but to a greater degree.

South Bend

Although it’s reasonably close to Chicago.

Weather

Three words: Cold. Snowy. April.

Best of the Rest

Each of the five head-coaching jobs above represents the whole enchilada.  All other head-coaching jobs involve more, and more significant, tradeoffs, which means personal preference plays a role in determining which job is better than another.

Listed below in no particular order is a non-exclusive collection of next-tier head coaching jobs, with truncated pros and cons:

Stanford

Unparalleled academics, but only so many high school football players can score a 1400 on the SAT.  This makes it hard to build a quality starting lineup, let alone depth.  Great weather.  Difficult to win big without a Plunkett, Elway, or Luck at QB, but since 8-5 is considered a good season, a head coach doesn’t have to win big at Stanford to stay as long as he wants.

Michigan

As with Notre Dame, it’s doubtful whether Michigan can win consistently with only Midwestern players.  Other drawbacks include weather, as well as the B1G’s ongoing and protracted decline.

Ohio State

See Michigan.  Oh, all right.

As with Notre Dame, it’s doubtful whether Ohio State can win consistently with only Midwestern players.  Other drawbacks include weather, as well as the B1G’s ongoing and protracted decline.

Alabama, Florida, Georgia, Texas A&M

Intense competition for in-state and area recruits.  Brutal SEC schedule.

Florida State

Intense competition for in-state and area recruits.  The ACC is first and foremost a basketball conference.

Oregon

Impossible to win consistently with only Oregon high school players.  Phil Knight’s patronage is a big positive.  Oregon is unquestionably hot right now, but its winning tradition is scant, and the program exudes an unmistakable flavor-of-the-month quality.  While Eugene is a nice college town, it is very remote.

Posted in Business, College Football, Sports | Leave a comment

Ode to a Chipotle Franchise

Why do I love thee, Chipotle Mexican Grill (CMG)?  Let me count the reasons:

  1. Low food cost.  In the restaurant game, alchemy is turning rice and beans into a $5 entree.  Chipotle has deliberately selected an inventory of prepared food items to combinatorially maximize its capacity to satisfy customer taste.  At the same time, almost everything it makes begins with ultra-low-cost rice and beans.  Vegetarian?  No problem; the black beans aren’t seasoned with meat, and you can choose from any of the grilled vegetables, picos and salsas.  Brilliant, just brilliant.
  2. Customer-centric focus.  By now I suspect everybody knows that Chipotle will make whatever they have the ingredients for, whether it’s on the menu or not.  Never tried it?  Order barbacoa quesadillas next time.
  3. Made to order, fast.  Chipotle’s service and assembly line means you don’t have to eat something that’s been sitting under a heat lamp, and you don’t have to scrape off the raw onions (or anything else you don’t like) before you can eat.  At the same time, you don’t have to take a number or wait at a table for someone to bring it to you, either.
  4. Spare, post-modern decor.  I’ll own this as my own personal architectural preference, but it also helps to keep the finish-out cost down.
  5. Technologically current.  Order ahead with Chipotle’s iPhone app.
Given Chipotle’s value proposition and service model, it’s no wonder that CMG’s shares have increased in value by 20% since the first of this year, and 155% since January 1, 2010.

Parenthetically, since the topic is best practices with respect to fast food, I’d be remiss if I didn’t also commend Five Guys.  Who would have thought that a fanatical emphasis on freshness – and complimentary peanuts – would get you SRO patronage with a menu limited to hamburgers, french fries, and soft drinks?


Posted in Business | Leave a comment

5 Things the Apple Store Gets Wrong

Earlier this month, Guy Kawasaki posted on his blog an article entitled “10 Things You Can Learn from the Apple Store.”  He then went on to praise the Apple retail model in ten different respects. Now Guy’s analysis is normally spot-on.  In this case, however, I’d disagree with Guy in that Apple Stores (at least the ones I’ve visited) are just as likely to do things wrong as right. Accordingly, here’s my list of 5 Things the Apple Store Gets Wrong, or, 5 Reasons Why I Avoid the Apple Store Whenever Possible.

  1. Queues aren’t inherently bad.  I’ve literally lost count of how many times I: (a) went into the Apple Store to buy something specific; (b) found it; and then (c) couldn’t find an employee who would take my money.  Sometimes I persisted until I found someone, and sometimes I put the item back and walked out of the store in disgust.  Look, I’m as right-brained as anybody, but queuing came about because it’s the most efficient way to process lots of transactions in one location.  Besides, it’s not like Ridley Scott will take his commercial back if Apple asks its customers to stand in line to pay.
  2. Creativity and chaos sometimes go together, but ultimately they’re two different things.  This relates back to number 1.  When I look at the layout of an Apple Store, unfortunately I don’t see Pablo Picasso, Richard Feynman, or Mies van der Rohe.  I don’t even see Andy Warhol.  What I see is chaos.  I see lots of employees talking and milling around, and none of them available – let alone eager – to help the customer who already knows what he wants and just wants to pay for it.
  3. Stop needlessly increasing labor cost by assigning an employee to stand at the front of the store and greet people. Especially if that employee doesn’t carry a credit card reader.  The Apple Store isn’t Wal-Mart, and Apple users are savvy enough to recognize a perfunctory greeting when they hear one.  This is a case where Apple can have its cake and eat it; take your greeters and put them behind a cash register.  Or at a minimum, give them a card reader and an unambiguous admonition to use it.
  4. If digital media that’s manufactured by Apple, Inc. isn’t available for download at apple.com, you’d damn sure better stock plenty of it at the Apple Store.  I was late to the party in upgrading from Leopard to Snow Leopard, and by the time I got around to it, Snow Leopard (i.e., Mac OS version 10.6) was no longer available for download.  No problem; I went to the Apple Store to pick it up.  Not only did the Apple Store not stock it, the Apple Store’s employee actually sent me home to order it by snail mail.  Come on, Apple.  This is digital media we’re talking about, not AppleTV.  If you don’t want to stock the physical discs, at least fix it so that Apple Store employees can burn the media to disc for the customer on request.
  5. Since Apple is apparently indifferent to sales per square foot, they should make the floor space 25% bigger.  Guy praises Apple in his article for avoiding clutter.  I guess it depends on how you define the term.  I’ll concede that Apple Stores don’t feature stacks of inventory everywhere you look.  But if the narrow aisles between display tables are so full of people that you can’t get to the item that you went there to buy, what difference does it make why?
Obviously, none of the foregoing detracts from the quality and superiority of the Mac OS.  But for all of Apple’s magnificence in other areas of its business, to contend that Apple’s haphazard retail model also represents best practice is a little rich.


Posted in Business, Pet Peeves | Leave a comment

The Mathematics of Stop-and-Go Traffic

The June 2010 issue of Wired magazine featured an algorithm developed at the University of Alberta and MIT for predicting the onset of traffic jams (disregarding extraneous road hazards).

Assume a hypothetical mile-long stretch of road on some interstate. The Wired algorithm claims that we can calculate the minimum traffic density at which a traffic jam could be expected to occur thusly:

minRhoJam = \frac{maxRho}{2} \left ( 1-\sqrt{1-\frac{400\, \beta }{vOpt^{2}}} \right )

where maxRho is the maximum density at which traffic could be expected to move unimpeded (albeit slowly); β is a coefficient describing the quality of the road (0.01 is a dirt road full of potholes, and 0.99 is a NASCAR oval), which the formula scales by 100; and vOpt is the maximum velocity at which a reasonably prudent driver would travel if otherwise unimpeded.

Let’s assume that under absolutely optimal conditions on our hypothetical road a maximum of 80 vehicles could move slowly, yet continuously and without stopping. We further assume the road quality merits a 0.7 rating, and that a reasonable person would travel up to 65 m.p.h. over our stretch of road unless forced to brake.

With these inputs, the Wired algorithm returns 38.97, meaning that we may expect traffic jams to occur at any density greater than 39 vehicles.
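The formula transcribes directly into code.  Here’s a minimal sketch in Python (the function and argument names are my own; the numeric result depends entirely on the unit conventions assumed for β and vOpt):

```python
import math

def min_rho_jam(max_rho, beta, v_opt):
    """Minimum traffic density at which a jam can be expected:
    (maxRho / 2) * (1 - sqrt(1 - 400 * beta / vOpt^2))."""
    discriminant = 1 - 400 * beta / v_opt ** 2
    if discriminant < 0:
        # The square root goes imaginary: no jam threshold exists
        raise ValueError("no real solution for these inputs")
    return (max_rho / 2) * (1 - math.sqrt(discriminant))

# Hypothetical road: 80 vehicles maximum, 0.7 road quality, 65 m.p.h.
print(min_rho_jam(80, 0.7, 65))
```

Note that a better road (larger β) raises the jam threshold, while a faster prudent speed (larger vOpt) lowers it.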

That’s nice to know, but like any nonlinear function, this one only becomes really useful when we determine the influence of the respective variables. We accomplish this by making each variable stochastic (i.e., allowing it to vary), and then taking the first derivative of the minRhoJam function with respect to each of the three variables.  Thereafter, we plot the derivative functions and examine the relative contribution of each.

Now, Excel doesn’t do calculus very enthusiastically, and it won’t do calculus at all unless you either use Solver, or assign a numeric value to everything.

Mathematica, however, does calculus naturally, so we can use Mathematica to symbolically characterize the first derivative of the minimum density function with respect to each of the variables. Thereafter, we can assign numeric values as we wish and plot the results.

Mathematica shows that the first derivative of the minRhoJam function w.r.t. maxRho is: \frac{1}{2}\, \left ( 1-\sqrt{1-\frac{400\, \beta }{vOpt^{2}}} \right ), which plotted looks like this:



The first derivative of minRhoJam w.r.t. β is: \frac{100\, maxRho}{vOpt^{2}\: \sqrt{1-\frac{400\, \beta }{vOpt^{2}}}}.



And the first derivative of minRhoJam w.r.t. vOpt is: -\frac{200\, maxRho\, \beta }{vOpt^{3}\: \sqrt{1-\frac{400\, \beta }{vOpt^{2}}}}.



The first two plots are linear and unremarkable, in that positive changes to the stochastic variable increase the density at which we can expect traffic jams to occur.  The third plot, however, is both non-linear and negatively sloped.  In other words, the greater the speed at which a reasonable driver would travel, the lesser the density at which traffic jams can be expected.
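Closed-form derivatives are easy to get wrong, so here’s a quick sanity check – a Python sketch (rather than Mathematica) that compares each symbolic partial derivative against a central finite difference:

```python
import math

def min_rho_jam(max_rho, beta, v_opt):
    # The Wired formula for the minimum jam density
    return (max_rho / 2) * (1 - math.sqrt(1 - 400 * beta / v_opt ** 2))

def central_diff(f, x, h=1e-6):
    # Central finite difference approximation of the first derivative
    return (f(x + h) - f(x - h)) / (2 * h)

max_rho, beta, v_opt = 80, 0.7, 65
root = math.sqrt(1 - 400 * beta / v_opt ** 2)

# Closed-form partial derivatives of minRhoJam
d_max_rho = (1 - root) / 2
d_beta = 100 * max_rho / (v_opt ** 2 * root)
d_v_opt = -200 * max_rho * beta / (v_opt ** 3 * root)

assert math.isclose(central_diff(lambda m: min_rho_jam(m, beta, v_opt), max_rho), d_max_rho, rel_tol=1e-5)
assert math.isclose(central_diff(lambda b: min_rho_jam(max_rho, b, v_opt), beta), d_beta, rel_tol=1e-5)
assert math.isclose(central_diff(lambda v: min_rho_jam(max_rho, beta, v), v_opt), d_v_opt, rel_tol=1e-5)
print("derivatives verified")
```

The signs come out as the plots suggest: the derivatives with respect to maxRho and β are positive, while the derivative with respect to vOpt is negative.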

One explanation is that vOpt is at least partly a function of driver expectations. The better the road, the faster (up to a point) an average driver expects to travel unimpeded. Naturally, these expectations are founded on an assumption that other drivers think the way they do, and more to the point, drive the same way they do.

If only.

Seen anybody creeping along – oblivious – in the left-hand lane of the interstate lately?  My own informal survey indicates that most of these drivers are distracted by some exigent need to talk on their cell phones (with no hands-free device).  Of the others, I imagine some are passive-aggressive, and the rest are just plain ignorant.  Either way, the recalcitrance of these drivers causes braking that would otherwise be unnecessary, which in turn invalidates the appropriate driving speed assumptions of other drivers.

So what’s the takeaway?  Once more: the left-hand lane of any freeway is for passing only, even if there are no posted signs to that effect.


Posted in Mathematica, Pet Peeves | Leave a comment