
7000 Words About The Dubious Refragmentation Of The Economy

One advantage of being old is that you can see change happen in your lifetime.

A lot of the change I’ve seen is fragmentation. For example, US politics and now Canadian politics are much more polarized than they used to be. Culturally we have ever less common ground, and though inclusiveness is preached by the media and the Left, special interest groups and policies have a polarizing effect. The creative class flocks to a handful of happy cities, abandoning the rest. And increasing economic inequality means the spread between rich and poor is growing too. I’d like to propose a hypothesis: that all these trends are instances of the same phenomenon. And moreover, that the cause is not some force that’s pulling us apart, but rather the erosion of forces that had been pushing us together.

Worse still, for those who worry about these trends, the forces that were pushing us together were an anomaly, a one-time combination of circumstances that’s unlikely to be repeated—and indeed, that we would not want to repeat.


The two forces were war (above all World War II), and the rise of large corporations.

The effects of World War II were both economic and social. Economically, it decreased variation in income. Like all modern armed forces, America’s were socialist economically. From each according to his ability, to each according to his need. More or less. Higher ranking members of the military got more (as higher ranking members of socialist societies always do), but what they got was fixed according to their rank. And the flattening effect wasn’t limited to those under arms, because the US economy was conscripted too. Between 1942 and 1945 all wages were set by the National War Labor Board. Like the military, they defaulted to flatness. And this national standardization of wages was so pervasive that its effects could still be seen years after the war ended. [1]

Business owners weren’t supposed to be making money either.

FDR said “not a single war millionaire” would be permitted. To ensure that, any increase in a company’s profits over prewar levels was taxed at 85%. And when what was left after corporate taxes reached individuals, it was taxed again at a marginal rate of 93%. [2]

Socially too the war tended to decrease variation. Over 16 million men and women from all sorts of different backgrounds were brought together in a way of life that was literally uniform. Service rates for men born in the early 1920s approached 80%. And working toward a common goal, often under stress, brought them still closer together.

Though strictly speaking World War II lasted less than 4 years for the USA, its effects lasted longer, and they extended north into Canada as well.

Wars make central governments more powerful, and World War II was an extreme case of this. In the US, as in all the other Allied countries, the federal government was slow to give up the new powers it had acquired. Indeed, in some respects the war didn’t end in 1945; the enemy just switched to the Soviet Union. In tax rates, federal power, defense spending, conscription, and nationalism the decades after the war looked more like wartime than prewar peacetime. [3] And the social effects lasted too. The kid pulled into the army from behind a mule team in West Virginia didn’t simply go back to the farm afterward. Something else was waiting for him, something that looked a lot like the army.

If total war was the big political story of the 20th century, the big economic story was the rise of a new kind of company. And this too tended to produce both social and economic cohesion. [4]

The 20th century was the century of the big, national corporation. General Electric, General Foods, General Motors. Developments in finance, communications, transportation, and manufacturing enabled a new type of company whose goal was above all scale. Version 1 of this world was low-res: a Duplo world of a few giant companies dominating each big market. [5]

The late 19th and early 20th centuries had been a time of consolidation, led especially by J. P. Morgan. Thousands of companies run by their founders were merged into a couple hundred giant ones run by professional managers. Economies of scale ruled the day. It seemed to people at the time that this was the final state of things. John D. Rockefeller said in 1880:


The day of combination is here to stay. Individualism has gone, never to return.

He turned out to be mistaken, but he seemed right for the next hundred years.

The consolidation that began in the late 19th century continued for most of the 20th. By the end of World War II, as Michael Lind writes, “the major sectors of the economy were either organized as government-backed cartels or dominated by a few oligopolistic corporations.”

For consumers this new world meant the same choices everywhere, but only a few of them. When I grew up there were only 2 or 3 of most things, and since they were all aiming at the middle of the market there wasn’t much to differentiate them.

One of the most important instances of this phenomenon was in TV.


Here there were 3 choices: NBC, CBS, and ABC. Plus public TV for eggheads and communists (jk). The programs the 3 networks offered were indistinguishable. In fact, here there was a triple pressure toward the center. If one show did try something daring, local affiliates in conservative markets would make them stop. Plus since TVs were expensive whole families watched the same shows together, so they had to be suitable for everyone.

And not only did everyone get the same thing, they got it at the same time. It’s difficult to imagine now, but every night tens of millions of families would sit down together in front of their TV set watching the same show, at the same time, as their next door neighbors. What happens now with the Super Bowl used to happen every night. We were literally in sync. [6]

In a way mid-century TV culture was good. The view it gave of the world was like you’d find in a children’s book, and it probably had something of the effect that (parents hope) children’s books have in making people behave better. But, like children’s books, TV was also misleading. Dangerously misleading, for adults. In his autobiography, Robert MacNeil talks of seeing gruesome images that had just come in from Vietnam and thinking, we can’t show these to families while they’re having dinner.

I know how pervasive the common culture was, because I tried to opt out of it, and it was practically impossible to find alternatives.

When I was 13 I realized, more from internal evidence than any outside source, that the ideas we were being fed on TV were crap, and I stopped watching it. [7] But it wasn’t just TV. It seemed like everything around me was crap. The politicians all saying the same things, the consumer brands making almost identical products with different labels stuck on to indicate how prestigious they were meant to be, the balloon-frame houses with fake “colonial” skins, the cars with several feet of gratuitous metal on each end that started to fall apart after a couple years, the “red delicious” apples that were red but only nominally apples. And in retrospect, it was crap. [8]

But when I went looking for alternatives to fill this void, I found practically nothing. There was no Internet then. The only place to look was in the chain bookstore in our local shopping mall. [9] There I found a copy of The Atlantic. I wish I could say it became a gateway into a wider world, but in fact I found it boring and incomprehensible. Like a kid tasting whisky for the first time and pretending to like it, I preserved that magazine as carefully as if it had been a book. I’m sure I still have it somewhere. But though it was evidence that there was, somewhere, a world that wasn’t red delicious, I didn’t find it till college.

It wasn’t just as consumers that the big companies made us similar. They did as employers too. Within companies there were powerful forces pushing people toward a single model of how to look and act. IBM was particularly notorious for this, but they were only a little more extreme than other big companies. And the models of how to look and act varied little between companies. Meaning everyone within this world was expected to seem more or less the same. And not just those in the corporate world, but also everyone who aspired to it—which in the middle of the 20th century meant most people who weren’t already in it. For most of the 20th century, working-class people tried hard to look middle class. You can see it in old photos. Few adults aspired to look dangerous in 1950.

But the rise of national corporations didn’t just compress us culturally. It compressed us economically too, and on both ends.

Along with giant national corporations, we got giant national labor unions. And in the mid 20th century the corporations cut deals with the unions where they paid over market price for labor. Partly because the unions were monopolies. [10] Partly because, as components of oligopolies themselves, the corporations knew they could safely pass the cost on to their customers, because their competitors would have to as well. And partly because in mid-century most of the giant companies were still focused on finding new ways to milk economies of scale. Just as startups rightly pay AWS a premium over the cost of running their own servers so they can focus on growth, many of the big national corporations were willing to pay a premium for labor. [11]

As well as pushing incomes up from the bottom, by overpaying unions, the big companies of the 20th century also pushed incomes down at the top, by underpaying their top management. Economist J. K. Galbraith wrote in 1967 that “There are few corporations in which it would be suggested that executive salaries are at a maximum.” [12]


To some extent this was an illusion.

Much of the de facto pay of executives never showed up on their income tax returns, because it took the form of perks. The higher the rate of income tax, the more pressure there was to pay employees upstream of it. (In the UK, where taxes were even higher than in the US, companies would even pay their kids’ private school tuitions.) One of the most valuable things the big companies of the mid 20th century gave their employees was job security, and this too didn’t show up in tax returns or income statistics. So the nature of employment in these organizations tended to yield falsely low numbers about economic inequality. But even accounting for that, the big companies paid their best people less than market price. There was no market; the expectation was that you’d work for the same company for decades if not your whole career. [13]

Your work was so illiquid there was little chance of getting market price. But that same illiquidity also encouraged you not to seek it. If the company promised to employ you till you retired and give you a pension afterward, you didn’t want to extract as much from it this year as you could. You needed to take care of the company so it could take care of you. Especially when you’d been working with the same group of people for decades. If you tried to squeeze the company for more money, you were squeezing the organization that was going to take care of them. Plus if you didn’t put the company first you wouldn’t be promoted, and if you couldn’t switch ladders, promotion on this one was the only way up. [14]

To someone who’d spent several formative years in the armed forces, this situation didn’t seem as strange as it does to us now. From their point of view, as big company executives, they were high-ranking officers. They got paid a lot more than privates. They got to have expense account lunches at the best restaurants and fly around on the company’s Gulfstreams. It probably didn’t occur to most of them to ask if they were being paid market price.

The ultimate way to get market price is to work for yourself, by starting your own company. That seems obvious to any ambitious person now. But in the mid 20th century it was an alien concept. Not because starting one’s own company seemed too ambitious, but because it didn’t seem ambitious enough. Even as late as the 1970s, when I grew up, the ambitious plan was to get lots of education at prestigious institutions, and then join some other prestigious institution and work one’s way up the hierarchy. Your prestige was the prestige of the institution you belonged to. People did start their own businesses of course, but educated people rarely did, because in those days there was practically zero concept of starting what we now call a startup: a business that starts small and grows big. That was much harder to do in the mid 20th century. Starting one’s own business meant starting a business that would start small and stay small. Which in those days of big companies often meant scurrying around trying to avoid being trampled by elephants. It was more prestigious to be one of the executive class riding the elephant.

By the 1970s, no one stopped to wonder where the big prestigious companies had come from in the first place.


It seemed like they’d always been there, like the chemical elements. And indeed, there was a double wall between ambitious kids in the 20th century and the origins of the big companies. Many of the big companies were roll-ups that didn’t have clear founders. And when they did, the founders didn’t seem like us. Nearly all of them had been uneducated, in the sense of not having been to college. They were what Shakespeare called rude mechanicals. College trained one to be a member of the professional classes. Its graduates didn’t expect to do the sort of grubby menial work that Andrew Carnegie or Henry Ford started out doing. [15]

And in the 20th century there were more and more college graduates. They increased from about 2% of the population in 1900 to about 25% in 2000. In the middle of the century our two big forces intersect, in the form of the GI Bill, which sent 2.2 million World War II veterans to college. Few thought of it in these terms, but the result of making college the canonical path for the ambitious was a world in which it was socially acceptable to work for Henry Ford, but not to be Henry Ford. [16]

I remember this world well. I came of age just as it was starting to break up. In my childhood it was still dominant. Not quite so dominant as it had been. We could see from old TV shows and yearbooks and the way adults acted that people in the 1950s and 60s had been even more conformist than us. The mid-century model was already starting to get old. But that was not how we saw it at the time. We would at most have said that one could be a bit more daring in 1975 than 1965. And indeed, things hadn’t changed much yet.

But change was coming soon.

And when the Duplo economy started to disintegrate, it disintegrated in several different ways at once. Vertically integrated companies literally dis-integrated because it was more efficient to. Incumbents faced new competitors as (a) markets went global and (b) technical innovation started to trump economies of scale, turning size from an asset into a liability. Smaller companies were increasingly able to survive as formerly narrow channels to consumers broadened. Markets themselves started to change faster, as whole new categories of products appeared. And last but not least, the federal government, which had previously smiled upon J. P. Morgan’s world as the natural state of things, began to realize it wasn’t the last word after all.

What J. P. Morgan was to the horizontal axis, Henry Ford was to the vertical. He wanted to do everything himself. The giant plant he built at River Rouge between 1917 and 1928 literally took in iron ore at one end and sent cars out the other. 100,000 people worked there. At the time it seemed the future. But that is not how car companies operate today. Now much of the design and manufacturing happens in a long supply chain, whose products the car companies ultimately assemble and sell. The reason car companies operate this way is that it works better. Each company in the supply chain focuses on what they know best. And they each have to do it well or they can be swapped out for another supplier.

Why didn’t Henry Ford realize that networks of cooperating companies work better than a single big company?

One reason is that supplier networks take a while to evolve. In 1917, doing everything himself seemed to Ford the only way to get the scale he needed. And the second reason is that if you want to solve a problem using a network of cooperating companies, you have to be able to coordinate their efforts, and you can do that much better with computers. Computers reduce the transaction costs that Coase argued are the raison d’etre of corporations. That is a fundamental change.

In the early 20th century, big companies were synonymous with efficiency. In the late 20th century they were synonymous with inefficiency. To some extent this was because the companies themselves had become sclerotic. But it was also because our standards were higher.

It wasn’t just within existing industries that change occurred. The industries themselves changed. It became possible to make lots of new things, and sometimes the existing companies weren’t the ones who did it best.

Microcomputers are a classic example.


The market was pioneered by upstarts like Apple, Radio Shack and Atari. When it got big enough, IBM decided it was worth paying attention to. At the time IBM completely dominated the computer industry. They assumed that all they had to do, now that this market was ripe, was to reach out and pick it. Most people at the time would have agreed with them. But what happened next illustrated how much more complicated the world had become. IBM did launch a microcomputer. Though quite successful, it did not crush Apple. But even more importantly, IBM itself ended up being supplanted by a supplier coming in from the side—from software, which didn’t even seem to be the same business. IBM’s big mistake was to accept a non-exclusive license for DOS. It must have seemed a safe move at the time. No other computer manufacturer had ever been able to outsell them. What difference did it make if other manufacturers could offer DOS too? The result of that miscalculation was an explosion of inexpensive PC clones. Microsoft now owned the PC standard, and the customer. And the microcomputer business ended up being Apple vs Microsoft.

Basically, Apple bumped IBM and then Microsoft stole its wallet. That sort of thing did not happen to big companies in mid-century. But it was going to happen increasingly often in the future.

Change happened mostly by itself in the computer business. In other industries, legal obstacles had to be removed first. Many of the mid-century oligopolies had been anointed by the federal government with policies (and in wartime, large orders) that kept out competitors. This didn’t seem as dubious to government officials at the time as it sounds to us. They felt a two-party system ensured sufficient competition in politics. It ought to work for business too.

Gradually the government realized that anti-competitive policies were doing more harm than good, and during the Carter administration started to remove them.

The word used for this process was misleadingly narrow: deregulation. What was really happening was de-oligopolization. It happened to one industry after another. Two of the most visible to consumers were air travel and long-distance phone service, which both became dramatically cheaper after deregulation.

Deregulation also contributed to the wave of hostile takeovers in the 1980s. In the old days the only limit on the inefficiency of companies, short of actual bankruptcy, was the inefficiency of their competitors. Now companies had to face absolute rather than relative standards. Any public company that didn’t generate sufficient returns on its assets risked having its management replaced with one that would. Often the new managers did this by breaking companies up into components that were more valuable separately. [17]

Version 1 of the national economy consisted of a few big blocks whose relationships were negotiated in back rooms by a handful of executives, politicians, regulators, and labor leaders. Version 2 was higher resolution: there were more companies, of more different sizes, making more different things, and their relationships changed faster. In this world there were still plenty of back room negotiations, but more was left to market forces. Which further accelerated the fragmentation.

It’s a little misleading to talk of versions when describing a gradual process, but not as misleading as it might seem. There was a lot of change in a few decades, and what we ended up with was qualitatively different. The companies in the S&P 500 in 1958 had been there an average of 61 years. By 2012 that number was 18 years. [18]

The breakup of the Duplo economy happened simultaneously with the spread of computing power. To what extent were computers a precondition? It would take a book to answer that. Obviously the spread of computing power was a precondition for the rise of startups. I suspect it was for most of what happened in finance too. But was it a precondition for globalization or the LBO wave? I don’t know, but I wouldn’t discount the possibility. It may be that the refragmentation was driven by computers in the way the industrial revolution was driven by steam engines. Whether or not computers were a precondition, they have certainly accelerated it.

The new fluidity of companies changed people’s relationships with their employers. Why climb a corporate ladder that might be yanked out from under you? Ambitious people started to think of a career less as climbing a single ladder than as a series of jobs that might be at different companies. More movement (or even potential movement) between companies introduced more competition in salaries. Plus as companies became smaller it became easier to estimate how much an employee contributed to the company’s revenue. Both changes drove salaries toward market price. And since people vary dramatically in productivity, paying market price meant salaries started to diverge.

By no coincidence it was in the early 1980s that the term “yuppie” was coined. That word is not much used now, because the phenomenon it describes is so taken for granted, but at the time it was a label for something novel. Yuppies were young professionals who made lots of money. To someone in their twenties today, this wouldn’t seem worth naming. Why wouldn’t young professionals make lots of money? But until the 1980s being underpaid early in your career was part of what it meant to be a professional. Young professionals were paying their dues, working their way up the ladder. The rewards would come later. What was novel about yuppies was that they wanted market price for the work they were doing now.

The first yuppies did not work for startups.


That was still in the future. Nor did they work for big companies. They were professionals working in fields like law, finance, and consulting. But their example rapidly inspired their peers. Once they saw that new BMW 325i, they wanted one too.

Underpaying people at the beginning of their career only works if everyone does it. Once some employer breaks ranks, everyone else has to, or they can’t get good people. And once started this process spreads through the whole economy, because at the beginnings of people’s careers they can easily switch not merely employers but industries.

But not all young professionals benefitted. You had to produce to get paid a lot. It was no coincidence that the first yuppies worked in fields where it was easy to measure that.

More generally, an idea was returning whose name sounds old-fashioned precisely because it was so rare for so long: that you could make your fortune. As in the past there were multiple ways to do it. Some made their fortunes by creating wealth, and others by playing zero-sum games. But once it became possible to make one’s fortune, the ambitious had to decide whether or not to. A physicist who chose physics over Wall Street in 1990 was making a sacrifice that a physicist in 1960 wasn’t.

The idea even flowed back into big companies. CEOs of big companies make more now than they used to, and I think much of the reason is prestige. In 1960, corporate CEOs had immense prestige. They were the winners of the only economic game in town. But if they made as little now as they did then, in real dollar terms, they’d seem like small fry compared to professional athletes and whiz kids making millions from startups and hedge funds. They don’t like that idea, so now they try to get as much as they can, which is more than they had been getting. [19]

Meanwhile a similar fragmentation was happening at the other end of the economic scale. As big companies’ oligopolies became less secure, they were less able to pass costs on to customers and thus less willing to overpay for labor. And as the Duplo world of a few big blocks fragmented into many companies of different sizes—some of them overseas—it became harder for unions to enforce their monopolies. As a result workers’ wages also tended toward market price. Which (inevitably, if unions had been doing their job) tended to be lower. Perhaps dramatically so, if automation had decreased the need for some kind of work.

And just as the mid-century model induced social as well as economic cohesion, its breakup brought social as well as economic fragmentation. People started to dress and act differently. Those who would later be called the “creative class” became more mobile. People who didn’t care much for religion felt less pressure to go to church for appearances’ sake, while those who liked it a lot opted for increasingly colorful forms. Some switched from meat loaf to tofu, and others to Hot Pockets. Some switched from driving Ford sedans to driving small imported cars, and others to driving SUVs. Kids who went to private schools or wished they did started to dress “preppy,” and kids who wanted to seem rebellious made a conscious effort to look disreputable. In a hundred ways people spread apart. [20]

Almost four decades later, fragmentation is still increasing.

Has it been net good or bad? I don’t know; the question may be unanswerable. Not entirely bad though. We take for granted the forms of fragmentation we like, and worry only about the ones we don’t. But as someone who caught the tail end of mid-century conformism, I can tell you it was no utopia. [21]

My goal here is not to say whether fragmentation has been good or bad, just to explain why it’s happening. With the centripetal forces of total war and 20th century oligopoly mostly gone, what will happen next? And more specifically, is it possible to reverse some of the fragmentation we’ve seen?

If it is, it will have to happen piecemeal. You can’t reproduce mid-century cohesion the way it was originally produced. It would be insane to go to war just to induce more national unity. And once you understand the degree to which the economic history of the 20th century was a low-res version 1, it’s clear you can’t reproduce that either.

20th century cohesion was something that happened at least in a sense naturally. The war was due mostly to external forces, and the Duplo economy was an evolutionary phase. If you want cohesion now, you’d have to induce it deliberately. And it’s not obvious how. I suspect the best we’ll be able to do is address the symptoms of fragmentation. But that may be enough.

The form of fragmentation people worry most about lately is economic inequality, and if you want to eliminate that you’re up against a truly formidable headwind—one that has been in operation since the stone age: technology. Technology is a lever. It magnifies work. And the lever not only grows increasingly long, but the rate at which it grows is itself increasing.

Which in turn means the variation in the amount of wealth people can create has not only been increasing, but accelerating.
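One way to make the lever metaphor concrete (a back-of-the-envelope formalization of my own, not anything from the original essay) is to write a person's output as ability times technological leverage:

\[
w(t) = \ell(t)\,a, \qquad \ell'(t) > 0, \qquad \ell''(t) > 0.
\]

The gap between two workers with abilities \(a_1 > a_2\) is then \(\Delta w(t) = \ell(t)(a_1 - a_2)\), and since \(\Delta w''(t) = \ell''(t)(a_1 - a_2) > 0\), the same fixed difference in ability produces a wealth gap that not only widens but widens at an increasing rate.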

The unusual conditions that prevailed in the mid 20th century masked this underlying trend. The ambitious had little choice but to join large organizations that made them march in step with lots of other people—literally in the case of the armed forces, figuratively in the case of big corporations. Even if the big corporations had wanted to pay people proportionate to their value, they couldn’t have figured out how. But that constraint has gone now. Ever since it started to erode in the 1970s, we’ve seen the underlying forces at work again. [22]

Not everyone who gets rich now does it by creating wealth, certainly. But a significant number do, and the Baumol Effect means all their peers get dragged along too. [23] And as long as it’s possible to get rich by creating wealth, the default tendency will be for economic inequality to increase. Even if you eliminate all the other ways to get rich. You can mitigate this with subsidies at the bottom and taxes at the top, but unless taxes are high enough to discourage people from creating wealth, you’re always going to be fighting a losing battle against increasing variation in productivity. [24]

That form of fragmentation, like the others, is here to stay. Or rather, back to stay. Nothing is forever, but the tendency toward fragmentation should be more forever than most things, precisely because it’s not due to any particular cause. It’s simply a reversion to the mean. When Rockefeller said individualism was gone, he was right for a hundred years. It’s back now, and that’s likely to be true for longer.

I worry that if we don’t acknowledge this, we’re headed for trouble.

If we think 20th century cohesion disappeared because of a few policy tweaks, we’ll be deluded into thinking we can get it back (minus the bad parts, somehow) with a few countertweaks. And then we’ll waste our time trying to eliminate fragmentation, when we’d be better off thinking about how to mitigate its consequences.

Notes

[1] Lester Thurow, writing in 1975, said the wage differentials prevailing at the end of World War II had become so embedded that they “were regarded as ‘just’ even after the egalitarian pressures of World War II had disappeared. Basically, the same differentials exist to this day, thirty years later.” But Goldin and Margo think market forces in the postwar period also helped preserve the wartime compression of wages—specifically increased demand for unskilled workers, and oversupply of educated ones.

(Oddly enough, the American custom of having employers pay for health insurance derives from efforts by businesses to circumvent NWLB wage controls in order to attract workers.)

[2] As always, tax rates don’t tell the whole story. There were lots of exemptions, especially for individuals. And in World War II the tax codes were so new that the government had little acquired immunity to tax avoidance. If the rich paid high taxes during the war it was more because they wanted to than because they had to.

After the war, federal tax receipts as a percentage of GDP were about the same as they are now.

In fact, for the entire period since the war, tax receipts have stayed close to 18% of GDP, despite dramatic changes in tax rates. The lowest point occurred when marginal income tax rates were highest: 14.1% in 1950. Looking at the data, it’s hard to avoid the conclusion that tax rates have had little effect on what people actually paid.

[3] Though in fact the decade preceding the war had been a time of unprecedented federal power, in response to the Depression. Which is not entirely a coincidence, because the Depression was one of the causes of the war. In many ways the New Deal was a sort of dress rehearsal for the measures the federal government took during wartime. The wartime versions were much more drastic and more pervasive though. As Anthony Badger wrote, “for many Americans the decisive change in their experiences came not with the New Deal but with World War II.”

[4] I don’t know enough about the origins of the world wars to say, but it’s not inconceivable they were connected to the rise of big corporations. If that were the case, 20th century cohesion would have a single cause.

[5] More precisely, there was a bimodal economy consisting, in Galbraith’s words, of “the world of the technically dynamic, massively capitalized and highly organized corporations on the one hand and the hundreds of thousands of small and traditional proprietors on the other.” Money, prestige, and power were concentrated in the former, and there was near zero crossover.

[6] I wonder how much of the decline in families eating together was due to the decline in families watching TV together afterward.

[7] I know when this happened because it was the season Dallas premiered. Everyone else was talking about what was happening on Dallas, and I had no idea what they meant.

[8] I didn’t realize it till I started doing research for this essay, but the meretriciousness of the products I grew up with is a well-known byproduct of oligopoly. When companies can’t compete on price, they compete on tailfins.

[9] Monroeville Mall was at the time of its completion in 1969 the largest in the country. In the late 1970s the movie Dawn of the Dead was shot there. Apparently the mall was not just the location of the movie, but its inspiration; the crowds of shoppers drifting through this huge mall reminded George Romero of zombies. My first job was scooping ice cream in the Baskin-Robbins.

[10] Labor unions were exempted from antitrust laws by the Clayton Antitrust Act in 1914 on the grounds that a person’s work is not “a commodity or article of commerce.” I wonder if that means service companies are also exempt.

[11] The relationships between unions and unionized companies can even be symbiotic, because unions will exert political pressure to protect their hosts. According to Michael Lind, when politicians tried to attack the A&P supermarket chain because it was putting local grocery stores out of business, “A&P successfully defended itself by allowing the unionization of its workforce in 1938, thereby gaining organized labor as a constituency.” I’ve seen this phenomenon myself: hotel unions are responsible for more of the political pressure against Airbnb than hotel companies.

[12] Galbraith was clearly puzzled that corporate executives would work so hard to make money for other people (the shareholders) instead of themselves. He devoted much of The New Industrial State to trying to figure this out.

His theory was that professionalism had replaced money as a motive, and that modern corporate executives were, like (good) scientists, motivated less by financial rewards than by the desire to do good work and thereby earn the respect of their peers. There is something in this, though I think lack of movement between companies combined with self-interest explains much of observed behavior.

[13] Galbraith (p. 94) says a 1952 study of the 800 highest paid executives at 300 big corporations found that three quarters of them had been with their company for more than 20 years.

[14] It seems likely that in the first third of the 20th century executive salaries were low partly because companies then were more dependent on banks, who would have disapproved if executives got too much. This was certainly true in the beginning. The first big company CEOs were J. P. Morgan’s hired hands.

Companies didn’t start to finance themselves with retained earnings till the 1920s. Till then they had to pay out their earnings in dividends, and so depended on banks for capital for expansion. Bankers continued to sit on corporate boards till the Glass-Steagall act in 1933.

By mid-century big companies funded 3/4 of their growth from earnings. But the early years of bank dependence, reinforced by the financial controls of World War II, must have had a big effect on social conventions about executive salaries. So it may be that the lack of movement between companies was as much the effect of low salaries as the cause.

Incidentally, the switch in the 1920s to financing growth with retained earnings was one cause of the 1929 crash. The banks now had to find someone else to lend to, so they made more margin loans.

[15] Even now it’s hard to get them to. One of the things I find hardest to get into the heads of would-be startup founders is how important it is to do certain kinds of menial work early in the life of a company. Doing things that don’t scale is to how Henry Ford got started as a high-fiber diet is to the traditional peasant’s diet: they had no choice but to do the right thing, while we have to make a conscious effort.

[16] Founders weren’t celebrated in the press when I was a kid. “Our founder” meant a photograph of a severe-looking man with a walrus mustache and a wing collar who had died decades ago. The thing to be when I was a kid was an executive. If you weren’t around then it’s hard to grasp the cachet that term had. The fancy version of everything was called the “executive” model.

[17] The wave of hostile takeovers in the 1980s was enabled by a combination of circumstances: court decisions striking down state anti-takeover laws, starting with the Supreme Court’s 1982 decision in Edgar v. MITE Corp.; the Reagan administration’s comparatively sympathetic attitude toward takeovers; the Depository Institutions Act of 1982, which allowed banks and savings and loans to buy corporate bonds; a new SEC rule issued in 1982 (rule 415) that made it possible to bring corporate bonds to market faster; the creation of the junk bond business by Michael Milken; a vogue for conglomerates in the preceding period that caused many companies to be combined that never should have been; a decade of inflation that left many public companies trading below the value of their assets; and not least, the increasing complacency of managements.

[18] Foster, Richard. “Creative Destruction Whips through Corporate America.” Innosight, February 2012.

[19] CEOs of big companies may be overpaid. I don’t know enough about big companies to say. But it is certainly not impossible for a CEO to make 200x as much difference to a company’s revenues as the average employee. Look at what Steve Jobs did for Apple when he came back as CEO. It would have been a good deal for the board to give him 95% of the company. Apple’s market cap the day Steve came back in July 1997 was 1.73 billion. 5% of Apple now (January 2016) would be worth about 30 billion. And it would not be if Steve hadn’t come back; Apple probably wouldn’t even exist anymore.

Merely including Steve in the sample might be enough to answer the question of whether public company CEOs in the aggregate are overpaid. And that is not as facile a trick as it might seem, because the broader your holdings, the more the aggregate is what you care about.

[20] The late 1960s were famous for social upheaval. But that was more rebellion (which can happen in any era if people are provoked sufficiently) than fragmentation. You’re not seeing fragmentation unless you see people breaking off to both left and right.

[21] Globally the trend has been in the other direction. While the US is becoming more fragmented, the world as a whole is becoming less fragmented, and mostly in good ways.

[22] There were a handful of ways to make a fortune in the mid 20th century. The main one was drilling for oil, which was open to newcomers because it was not something big companies could dominate through economies of scale. How did individuals accumulate large fortunes in an era of such high taxes? Giant tax loopholes defended by two of the most powerful men in Congress, Sam Rayburn and Lyndon Johnson.

But becoming a Texas oilman was not in 1950 something one could aspire to the way starting a startup or going to work on Wall Street were in 2000, because (a) there was a strong local component and (b) success depended so much on luck.

[23] The Baumol Effect induced by startups is very visible in Silicon Valley. Google will pay people millions of dollars a year to keep them from leaving to start or join startups.

[24] I’m not claiming variation in productivity is the only cause of economic inequality in the US. But it’s a significant cause, and it will become as big a cause as it needs to, in the sense that if you ban other ways to get rich, people who want to get rich will use this route instead.

Thanks to Sam Altman, Trevor Blackwell, Paul Buchheit, Patrick Collison, Ron Conway, Chris Dixon, Benedict Evans, Richard Florida, Ben Horowitz, Jessica Livingston, Robert Morris, Tim O’Reilly, Geoff Ralston, Max Roser, Alexia Tsotsis, and Qasar Younis for reading drafts of this. Max also told me about several valuable sources. Essay from http://paulgraham.com/re.html

Bibliography

Allen, Frederick Lewis. The Big Change. Harper, 1952.

Averitt, Robert. The Dual Economy. Norton, 1968.

Badger, Anthony. The New Deal. Hill and Wang, 1989.

Bainbridge, John. The Super-Americans. Doubleday, 1961.

Beatty, Jack. Colossus. Broadway, 2001.

Brinkley, Douglas. Wheels for the World. Viking, 2003.

Brownlee, W. Elliot. Federal Taxation in America. Cambridge, 1996.

Chandler, Alfred. The Visible Hand. Harvard, 1977.

Chernow, Ron. The House of Morgan. Simon & Schuster, 1990.

Chernow, Ron. Titan: The Life of John D. Rockefeller. Random House, 1998.

Galbraith, John. The New Industrial State. Houghton Mifflin, 1967.

Goldin, Claudia and Robert A. Margo. “The Great Compression: The Wage Structure in the United States at Mid-Century.” NBER Working Paper 3817, 1991.

Gordon, John. An Empire of Wealth. HarperCollins, 2004.

Klein, Maury. The Genesis of Industrial America, 1870-1920. Cambridge, 2007.

Lind, Michael. Land of Promise. HarperCollins, 2012.

Micklethwait, John, and Adrian Wooldridge. The Company. Modern Library, 2003.

Nasaw, David. Andrew Carnegie. Penguin, 2006.

Sobel, Robert. The Age of Giant Corporations. Praeger, 1993.

Thurow, Lester. Generating Inequality: Mechanisms of Distribution. Basic Books, 1975.

Witte, John. The Politics and Development of the Federal Income Tax. Wisconsin, 1985.

 

New Doc About A Pioneer Of Modern Design

Architect Eliot Noyes was one of the leading pioneers of modern design during the mid-century, post-war boom in the United States. Educated by Walter Gropius at Harvard, Noyes did more than anyone to align the Modernist design ethos with the needs of ascendant corporate America. His impact on companies like IBM and Mobil Oil paved the way for Apple and many of the other design-conscious brands we know today.

Modernism, Inc. follows Noyes’ career up to a time when disruptive designers of the early ’70s began pushing back against the Modernist view of progress. As he did in Eames: The Architect and the Painter (co-directed with Bill Jersey), filmmaker Jason Cohn uses the story of a mid-century icon to raise contemporary questions about the role of a designer in today’s world.
Jason Cohn will appear in person at both the New York and Los Angeles openings.
“Well-researched, well-crafted and fascinating. Noyes created and forged some of the most respected, meaningful, and influential industrial design programs in the United States. The rise and attraction of modernism on the heels of the postwar economic boom birthed contemporary design and philosophy that continue to exist and flourish today, influencing mega businesses, including Apple. The idea of beauty with utility, which was generated from the Bauhaus art movement, continues to remain vital today, and for this, we thank Noyes.”
-Sabina Dana Plassa, Film Threat

“Noyes held the belief that design was not an afterthought, but rather something that needed to be infused into every aspect of a company’s thought process. The result helped IBM become the technical design juggernaut of the early personal computing era—and paved the way for companies like Apple to design products for the 21st century.”
-Sara Harowitz, The Georgia Straight
MODERNISM, INC. Directed by Jason Cohn. 79 minutes | color | In English | 2023
CREDITS

Jason Andrew Cohn  Writer and Director
Camille Servan-Schreiber  Producer
Kevin Jones and Jason Cohn  Editors
Steven Emerson  Music Composer
Sebastian Roché  Narrator
Talia Mindich  Associate Producer
Talia Mindich, Andrei Valladolid, Nina Goodby  Assistant Editors

© 2023 Bread & Butter Films
For the Silo, Kelly Hargraves.

The Benefits And Disruptions Of Blockchain Technology

It’s safe to say Blockchain technology has disrupted the internet in quite a dramatic fashion.

Despite being invented only about fourteen years ago, cryptocurrency has formed a world of its own, and it is now estimated that the market will hit well above a $1 trillion USD valuation, up sharply from where it stood four years ago.

The technology allows transactions, along with other items such as documents and invoices, to be sent across a P2P network, and it has been praised for its advanced security and anonymity benefits.
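To make the security claim concrete, here is a minimal sketch in Python of the hash-chaining idea that makes a blockchain tamper-evident. This is an illustration only, assuming nothing about any particular platform, and it omits the consensus and peer-to-peer layers that real networks add on top:

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def make_block(data, prev_hash: str) -> dict:
    """Create a block that commits to its payload and to its predecessor."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Build a tiny chain: each block stores the hash of the block before it.
chain = [make_block("genesis", "0" * 64)]
for payload in ["invoice #1", "payment #2"]:
    chain.append(make_block(payload, hash_block(chain[-1])))

def verify(chain) -> bool:
    """Editing any past block breaks every later prev_hash link."""
    return all(
        chain[i]["prev_hash"] == hash_block(chain[i - 1])
        for i in range(1, len(chain))
    )

print(verify(chain))   # True
chain[1]["data"] = "invoice #1, doubled"
print(verify(chain))   # False: the tampering is detectable
```

Because every block commits to the hash of its predecessor, altering any past record breaks all the links after it, which is exactly what other participants on the network can check for.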

The latest infographic crafted by Bitfortune looks at the many ways Blockchain technology has impacted the world and various industries, such as charities and banking.

In the world of cybersecurity, Blockchain can help reduce or eliminate fraud and errors, along with being a more accurate and confidential platform for industry professionals.

For example, REMME is a secure platform that has eliminated the need for passwords and instead uses Blockchain as a form of authentication.

In supply chain management, Blockchain is being used to reduce the number of errors and exposure to potential threats.

The technology also helps reduce time and increase efficiency – a win-win all around, really. FedEx recently announced plans for an internal Blockchain pilot program that will help solve customer disputes while IBM and Maersk are also working together on a new company that will use Blockchain within global shipping supply chains.

Take a look at the infographic below to learn how Blockchain is disrupting other industries around the world for the better and why it’s time to start thinking about how you can incorporate the technology into your lives.

For the Silo, Josh Wardini. 

16 Blockchain Disruptions Infographic

Supplemental- Why some traditional banks are broke: Fractional reserve banking.

Years Of iPhone Innovations Meant Big Expectations For Newest Models

Have things changed in the past 5 years? Take a look at this article from 2014 and let us know via the comments section below.

In the summer of 2007, Mike Lazaridis, co-founder of BlackBerry, got an iPhone to check what was inside. He pried it open and was shocked by what he saw: BlackBerry wasn’t competing with a phone, he thought, it was competing against a Mac. Lazaridis recalled that moment in an interview with The Globe and Mail about the months leading up to the fall of RIM.

Such is the iPhone’s disruptive story: it put the computer in our phones and made them smart. Suddenly, we could buy and play music on our phones, surf the net via wifi, run a desktop-like OS, and, the best defining feature of a smartphone, download apps. We did all that without a keypad (to BlackBerry’s shock). No, Apple didn’t invent these technologies; it innovated on them. Over a decade earlier, IBM had Simon, the world’s first smartphone.

In the infographic prepared by our creative team we highlighted the key features in each iPhone launch since the first generation phone came out in 2007. Some features are truly innovative (A series chip, Siri, App Store) and some are unabashed embellishments.

So what’s in store for future iPhones? We can get some clues from Apple patents registered with the U.S. Patent and Trademark Office. Apple is developing an audio jack that doubles as a headphone jack, plus an audio transducer that doesn’t need a grille to emit sound. That means future iPhones could be totally enclosed or waterproofed. Another patent describes combining a motion analyzer, a scenery analyzer, and a lockout mechanism to detect if you’re driving and disable the Messages app. With texting-related car accidents on the rise, expect this feature sooner rather than later.

Yet another patent indicates that Apple is cooking up an intelligent home page that brings up the app you need in specific scenarios, like when you need to show an electronic ticket at an airport or an e-coupon at a counter. The patent uses location-based signals and tracks user data patterns like calendars, emails, notes, etc. to predict when to bring up the app.

But let’s not talk about the future; rather, let’s see what iPhone users want today. For the Silo, Alex Hillsberg.

iPhone6 Predicted


 

Supplemental- Are Apple products made ethically?

Peter Robinson On Job Obsolescence- Government Should Not Protect Jobs

A recent OECD report finds that low and middle income earners have seen their wages stagnate and that the income share of middle-skilled jobs has fallen. Rising inequality has led to concerns that top earners are getting a disproportionate share of the gains from global “openness and interconnection”. The OECD’s Summer 2017 Employment Outlook found that job polarization has been “driven by pervasive and skill-biased technological changes.”

Founded in 1945, the United States Council for International Business (USCIB) builds awareness among business executives, educators and policy makers around issues related to employment, workforce training and skills enhancement. CMRubinWorld spoke with USCIB President and CEO Peter M. Robinson, who serves as a co-chair of the B20 Employment and Education Task Force, through which he helped develop recommendations to the G20 leaders on training for the jobs of the future. Robinson also serves on the board of the International Organization of Employers, which represents the views of the business community in the International Labor Organization.

“I think the guiding principle for government should be to protect and enable/retrain the worker, not protect the job. Policy makers and educators should focus on making sure that workers are as equipped as possible to transition to new opportunities,” says Peter Robinson.

Peter, welcome. How severe do you believe jobsolescence will be over the next 20 years? How big will the challenge be to offset it and maintain a growing workforce?

I really don’t think the overall effect will be as dramatic as some people fear, at least for the medium-term as far as we can tell. There is an over-hype factor at play, but the consequences still deserve serious attention. For one thing, so many of the jobs in the United States, Canada and other advanced economies are in the service sector, and involve interacting with other people. Despite all the advances in AI, we are still a long way off from robotic nurses or home health aides. Overall, history tells us that at least as many new jobs are created as are displaced by technological innovation, even though transitions can be difficult in some sectors and localities, and as long as upskilling takes place.

“The biggest threat is that our educational institutions won’t be able to keep pace with new skills demands.” — Peter Robinson

What do you think are the biggest obstacles facing college grads today trying to enter the workforce?

I actually think the greatest obstacles are faced by those who don’t make it to university or some form of higher education beyond high school (a four-year degree is not the right path for everyone). A 2014 Pew survey found that among workers age 25 to 32, median annual earnings of those with a college degree were $17,500 greater than for those with high school diplomas only. Obviously, everyone at whatever educational level needs to keep their skills sharp, and governments should join with employers and educators to instill better life-long learning. But there are far fewer established paths toward long-term employment at a middle-class level of income for those who don’t graduate from college. A greater emphasis on vocational education and apprenticeships would help. We strongly support the work being done by United States Secretary of Labor Acosta to promote apprenticeships.

Given that machines are in the process of stripping white collar workers of their jobs, what kind of skills are key manufacturing and service industries going to need from new employees?

I think the premise of your question is overstated. We’re all being told that our jobs are doomed by robots and automation. But the OECD estimates that only nine percent of jobs across the 35 OECD nations are at high risk of being automated, although of course even nine percent can create social difficulties. But there is an established track record across history of new technologies creating at least as many new jobs as they displace. Usually these new jobs demand higher skills and provide higher pay. The biggest threat is that our educational institutions won’t be able to keep pace with new skills demands.

“It is becoming clear that Versatility matters, in a constantly changing world, so Jim Spohrer’s IBM model of a “T-shaped” person holds true: broad and deep individuals capable of adapting and going where the demand lies.” — Peter Robinson

In an economy with a significant on-demand labor force, what competencies will these workers need to compete?

There are two types of competencies that will be needed: “technical” – or in other words, related to deep knowledge of a specific domain, whether welding or optogenetics; and “transversal,” which applies to all occupations. Those are described by the Center for Curriculum Redesign as skills (creativity, critical thinking, communication, collaboration), character (mindfulness, curiosity, courage, resilience, ethics, leadership) and meta-learning (growth mindset, metacognition).

How will managerial skill requirements change as a result of major structural changes that are likely, including human replacement by machines and growth of the on-demand economy?

OECD’s BIAC surveys of 50 employer organizations worldwide have shown that employers value not just Skills as described above, but also Character qualities. Further, it is becoming clear that Versatility matters, in a constantly changing world, so Jim Spohrer’s IBM model of a “T-shaped” person holds true: broad and deep individuals capable of adapting and going where the demand lies.

Canada Unemployment Rate By Provinces and Territories

“We often hear about the need for more STEM education. But I think there is an equal need for a greater emphasis on the humanities and the arts, for their intrinsic value as well as for developing skills and character qualities.” — Peter Robinson

What central changes in school curricula do you envision, both at the secondary school and college levels?

We often hear about the need for more STEM education. But I think there is an equal need for a greater emphasis on the humanities and the arts for their intrinsic value as well as for developing skills and character qualities as described above. As David Barnes of IBM wrote recently, these skills are more durable and are also a very good indicator of long-term success in employment.

How can the evolving changes in competencies required for employment be effectively translated into school curricula? Where are the main opportunities to enable this? e.g. Assessment systems? Business/Education collaboration? Curriculum change?

I’d go back to something else David Barnes said: We need much stronger connections between education and the job market, in the form of more partnerships among employers, governments and education institutions. Everyone needs to step up and create true partnerships. No one sector of society can address this alone. OECD’s BIAC has also documented employers’ wishes for deep curricular reforms to modernize content and embed competencies in order to meet today’s market needs.

What role should government play in ensuring citizens receive a quality and relevant education given the challenges that lie ahead?

I think the guiding principle for government should be to protect and enable/retrain the worker, not protect the job. Policy makers and educators should focus on making sure that workers are as equipped as possible to transition to new opportunities as these develop, and on ensuring that businesses have the freedom to pivot and adopt new technologies and business processes.

For the Silo, C.M. Rubin. C.M. Rubin is the author of two widely read online series, “The Global Search for Education” and “How Will We Read?”, for which she received a 2011 Upton Sinclair award. She is also the author of three bestselling books, including The Real Alice in Wonderland, is the publisher of CMRubinWorld and is a Disruptor Foundation Fellow.

How To Present New Ideas To Your Intimidating Boss

Not all ideas are ‘shot down’ by an intimidating boss: Albert Einstein’s formal letter paved the way to American atom bomb research.

Everyone says they want innovation in their organization, but when an ambitious employee offers it to a boss or CEO, the idea is often shot down, says Neal Thornberry, Ph.D., faculty director for innovation initiatives at the Naval Postgraduate School in California.

“Senior leaders often miss the value-creating potential of a new concept because they either don’t take the time to really listen and delve into it, or the innovating employee presents it in the wrong way,” says Thornberry, who recently published “Innovation Judo” (www.NealThornberry.com), based on his years of experience teaching innovation at Babson College and advising an array of corporate clients, from Ford Motor Co. and IBM to Cisco Systems.

Neal Thornberry: " Innovation should be presented as opportunities, not ideas. Opportunities have gravitas while ideas do not!”
Neal Thornberry: ” Innovation should be presented as opportunities, not ideas. Opportunities have gravitas while ideas do not!”

Thornberry outlines a template for innovation that works:

1 Intention: Start by asking why the organization wants innovation in the first place. Once the “why” is answered, leaders have the beginnings of a legitimate roadmap to innovation’s fruition. This is no small task and requires some soul searching.

“I once worked with an executive committee and got six different ideas of what ‘innovation’ meant,” he says. “One wanted new products, another focused on creative cost-cutting, and the president wanted a more innovative culture. The group needed to agree on their intent before anything else.”

2 Infrastructure: This is where you designate who is responsible for what. It’s tough, because the average employee will not take on new responsibility and potential risk without incentive. Some companies create units specifically focused on innovation, while others try to change the company culture to foster innovation throughout. “Creating a culture takes too long,” Thornberry says. “Don’t wait for that.”

3 Investigation: What do you know about the problem? IDEO may be the world’s premier organization for investigating innovative solutions; suffice it to say that it doesn’t skimp on collecting and analyzing data. At this point, data collection is crucial, whereas brainstorming often proves a waste of time if participants come in with the same ideas, knowledge and opinions they had last week and no new learning in their pockets.

4 Ideation: The fourth step is also the most fun and, unfortunately, the part many companies leap to first. This is dangerous because you may uncover many exciting and good ideas, but if the right context and focus aren’t provided up front and team members cannot get on the same page, a company is wasting its time. That is why intent must be the first step for any company seeking to increase innovation. Innovation should be viewed as a set of tools or processes, not a destination.

If you’re going to ‘demo’ your idea, you’d better have practiced and perfected your routine before showing your boss.

5 Identification: Here’s where the rubber meets the road on innovation. Whereas the previous step was creative, now logic and subtraction must be applied to focus on a result. Again, ideas are great, but they must be grounded in reality. An entrepreneurial attitude is required here, one that enables the winnowing of ideas, leaving only those with real value-creating potential.

“Innovation without the entrepreneurial mindset is fun but folly,” Thornberry notes.

6 Infection: Does anyone care about what you’ve come up with? Will excitement spread during this infection phase? Now is the time to find out. Pilot testing, experimentation and speaking directly with potential customers begin to show how innovative and valuable an idea really is. This phase is part selling, part research and part science. If people can’t feel, touch or experience your new idea in part or in whole, they probably won’t get it. This is where the innovator has a chance to reshape an idea into an opportunity, mitigate risk, assess resistance and build allies for the endeavor.

7 Implementation/Integration: While many talk about this final phase, they often fail to address the integration part. Implementation refers to the tactics employed to put an idea into practice. This is actually a perilous phase because, for implementation to succeed, the idea must first be integrated with other activities in the business and aligned with strategy. An innovation, despite support from the top, can still fail if a department cannot work with it.

For the Silo, Neal Thornberry.

Working 9 to 5? Think about the best times to approach your boss.

Neal Thornberry, Ph.D., is the founder and CEO of IMSTRAT, LLC, a consulting firm that specializes in helping private and public sector organizations develop innovation strategies. A respected thought leader in innovation, Thornberry is a highly sought-after international speaker and consultant. He also serves as the faculty director for innovation initiatives at the Center for Executive Education at the Naval Postgraduate School in Monterey, Calif. Thornberry, author of “Innovation Judo: Disarming Roadblocks & Blockheads on the Path to Creativity,” holds a doctorate in organizational psychology and specializes in innovation, corporate entrepreneurship, leadership and organizational transformation.

Fashion And Technology Connect At New York Fashion Week

What would it be like to listen in on a discussion of new materials and textiles by a director at NASA, an entrepreneur focused on branding for dogs and an international superstar in fashion? It may sound odd at first, but in today’s fashion industry everyone is linked through technology, which is now every brand’s biggest competitor.

Marcelo & Jordana Guimaraes (husband and wife) are the founders of FASHINNOVATION – an event bringing the thought leaders in the fashion and technology industries in front of an audience to discuss topics including production/manufacturing, sustainability, textiles/technology, branding/design, e-commerce, celebrities in fashion and “fashion-tech on the rise”(startups). These conversations will take place in moderated panels sharing a common goal: to spark the entrepreneurial mindset through inspiration and knowledge.

FASHINNOVATION New York Flyer

FASHINNOVATION will take place during New York Fashion Week on September 12th, 2018.

“I’ve been involved in the entrepreneurial ecosystem throughout my life, always with a focus on technology. The challenge with the fashion and technology industries is keeping them interconnected so you always have a finger on the pulse of the latest innovations, making things exciting and thought-provoking. With the merging of ideas, the outcome is unimaginable,” says founder Marcelo Guimaraes.

A look back at an earlier NYC Fashion Week show.

Confirmed to participate are: Vice President of Shopify/Co-Founder of Shopify Plus – Loren Padelford; Co-Founder of Ministry of Supply – Gihan Amarasiriwardena; Global Fashion Industry Leader & Partner at IBM – Karl Haller; Best-Selling Author & Host of TV Show Elevator Pitch – David Meltzer; Designer/Founder at Kris Harring – Kris Harrington; Co-Founder of Bolt Threads – David Breslauer; Fashion Designer (Project Runway season 11 & All Stars) – Layana Aguilar; Art Creative for MAC make-up & Fenty by Rihanna – Jason Naylor; Founders of HICKIES (collaboration w/ DVF) – Mariquel Waingarten & Gaston Frydlewski; Co-Founder of M.Gemi – Ben Fischman; Executive Director at FIT’s Infor Design and Tech Lab – Michael Ferraro; Co-Founder of Universal Standard – Alexandra Waldman; Founders of Haerfest – Tim & Dan Joo; Founder of ByBorre (NIKE & Louis Vuitton collabs) – Borre Akkersdijk; Founder of Zee Dog – Thadeu Diz; Co-Founder of WindowsWear – Jon Harari; Program Executive at NASA – Jenn Gustetic; and Founder of Fashion Tech Consortium – Michael Reidbord. Participating institutions include Singularity University (NYC Chapter) and Endeavor. A portion of all proceeds will be donated to NFTE.

“It’s exciting to see the positive feedback on FASHINNOVATION from leaders in the fashion and tech industries. We have additional confirmations on the way, ranging from fashion personalities to platinum-record superstars. The eclectic mix of people and companies participating will ensure that everyone who attends walks away with unmatched knowledge,” says Co-Founder & PR Director Jordana Guimaraes.

The event will conclude with a cocktail networking reception featuring live performances by Ezinma, a “violin superstar” who has performed with Beyoncé, and by singer/songwriter Heather LaRose, who recently hit 1 million streams online and is best known for her song “New Moon,” featured on MTV’s “Teen Wolf.”