Most economists surveyed by The Wall Street Journal expected Federal Reserve officials to begin winding down their $4.5 trillion portfolio of bonds and other assets this year.
Nearly 70% of business and academic economists polled in recent days expected the Fed to begin allowing the portfolio, also called the balance sheet, to shrink at some point in 2017 by letting securities mature without reinvesting the proceeds. Of the economists who expected a shift in the Fed’s balance-sheet strategy this year, the majority predicted the process would begin in December.
In last month’s survey, just 22.2% of economists expected the Fed to begin shrinking its portfolio this year. Fewer than a quarter of economists in the latest poll expected the Fed to wait until the first quarter of next year to start whittling down its portfolio, compared with a third last month.
In recent weeks, Fed officials have said they are discussing plans to start gradually reducing the large bondholdings the central bank accumulated during and after the financial crisis through asset-purchase programs aimed at lowering long-term interest rates and boosting economic growth.
The Fed wants to shrink the balance sheet to an undetermined size now that the economy is growing moderately, although officials haven’t decided exactly how to do it or when to start.
The central bank currently reinvests the proceeds from its maturing assets, and could decide to taper the pace of those reinvestments over several months or cease them altogether.
Fed Chairwoman Janet Yellen and other senior officials have stressed that they want the process to be gradual and predictable.
“Expect the Fed to announce tapering strategy details at the December meeting with reinvestments beginning to decline in January 2018,” Deutsche Bank Chief U.S. Economist Joseph LaVorgna said in the latest WSJ survey.
[Chart: When Will the Fed Act? Expectations for a June interest-rate increase have soared since the Fed last raised rates in March. Source: WSJ Survey of Economists]
Some of the business and academic economists polled this month said they view shrinking the balance sheet as complementary to tightening monetary policy through gradual increases in the Fed’s benchmark short-term interest rate, the federal-funds rate, since shrinking the balance sheet would likely cause long-term rates to rise.
Scott Anderson at Bank of the West said he expects the Fed to raise the fed-funds rate in June and September and then pause rate increases “for a while as they start to scale back their balance sheet.”
Gregory Daco of Oxford Economics expected the Fed to hold off on raising rates in the final quarter of this year once it begins addressing the balance sheet. “The Fed is eager to mop up excessive liquidity,” he said.
On interest rates, most economists surveyed expected the Fed to hold short-term interest rates steady at its May 2-3 policy meeting, and next raise them in June.
Nearly 80% of the economists surveyed expected the Fed will raise rates at its June 13-14 policy meeting, up from nearly 70% in last month’s survey. Just two out of 61 economists polled in April expected the next rate increase in July, 10 expected it in September and only one predicted officials will hold off until December to next raise rates.
While most economists forecast the next rate rise in June, they were divided over when the Fed will move after that. More than half, 55.7%, expected the central bank to increase interest rates to a range of 1.25% to 1.5% in September. Just under a third, 31.1%, expected the third rate increase of 2017 in December.
Economists saw, on average, just a 14% probability of a rate increase in May.
The Wall Street Journal surveyed 61 economists from April 7 to 11, but not everyone answered every question.
Stan Druckenmiller recently put it this way: “Earnings don’t move the overall market; it’s the Federal Reserve Board… focus on the central banks and focus on the movement of liquidity… most people in the market are looking for earnings and conventional measures. It’s liquidity that moves markets.”
Even with the bond market’s muted response to the Federal Reserve’s plan to begin winding down its almost $4.3 trillion portfolio of mortgage and Treasury securities, there are plenty of reasons why the calm probably won’t last.
Out of style for almost a decade, volatility may be on its way back if you take a closer look at the mechanics of the Treasury and mortgage markets. Despite the Fed’s mantra of seeking to carry out its policy shift in a “gradual and predictable manner,” analysts say the effects of ending the reinvestment of the proceeds from maturing securities will still be felt.
This is the “most highly anticipated event in central-bank history,” said Walter Schmidt, senior vice president of structured products at FTN Financial in Chicago. “We’ve known this for two years. We’ve been waiting for this.”
While the three rounds of Fed asset purchases that became known as quantitative easing sapped volatility, former Fed Chairman Ben Bernanke’s comments in May 2013 that the central bank was considering scaling back purchases showed how quickly that can change. The so-called taper tantrum sent yields surging.
As the Fed begins to unwind, here are four reasons why we may see a renewal in volatility:
1. MBS Supply/Demand Shift
The Fed owns $1.77 trillion of agency mortgage-backed securities, about 31 percent of the market. As the central bank’s MBS holdings begin to roll off, mortgage spreads to Treasuries are going to have to widen to adjust for the additional supply, which some analysts estimate will begin at around $5 billion a month.
Since the Fed concluded quantitative easing in October 2014, the spread between the Fannie Mae 30-year current coupon and Treasuries has sat between 90 and 114 basis points, below its historical average of about 137 basis points. Mortgage spreads may widen five to 10 basis points once the market fully prices in the tapering of reinvestments, and another 10 to 20 basis points over the longer term, Citigroup Inc. analysts estimate.
2. Increased Convexity Hedging
If the Fed decides to pause interest-rate hikes while letting the balance sheet shrink, mortgage rates are still going to rise because a large source of demand is disappearing. As a result, prepayment speeds, the pace at which borrowers pay off loans ahead of schedule, are going to fall, which will cause the duration of the securities to increase.
It would be a double whammy if the Fed continues to raise rates: tightening would push up the effective fed-funds rate, further reducing prepayment speeds and increasing the average duration of the securities.
When rates rise, hedging against so-called convexity risk grows as the expected life of mortgage debt increases. That happens when refinancing slows, and it tends to leave holders more vulnerable to losses, since longer-duration securities are more sensitive to rising rates. By protecting against those potential losses (selling Treasuries or entering into swaps contracts), traders can end up making the bond market more turbulent.
3. Rise in Term Premium, Withdrawal from Risk Assets
As the market prepares for the Fed’s unwind, it should place upward pressure on the 10-year term premium, a measure of the extra compensation investors demand to hold a longer-term instrument instead of rolling over a series of short-dated obligations. The premium could rise 47 basis points over the course of 2018 and 2019 as the Fed stops absorbing duration, according to Bank of America Merrill Lynch strategists. Higher term premiums, coupled with increased mortgage duration, could also cause a steepening of the five- to 10-year yield curve.
There’s also a chance that an increase in term premium triggers a withdrawal from risk assets such as equities, which have risen to record highs during almost a decade of accommodative Fed policy, though “the risk asset link is not as certain,” according to Bank of America strategist Mark Cabana.
4. Surge in Front-End Treasury Rates
The front end of the Treasury market will have its own set of issues when the balance sheet starts to shrink. The Treasury Department will have to decide in which portion of the curve it wants to issue more securities: the front end, where Treasury bills outstanding comprise less than 13 percent of marketable debt, or the long end, to take advantage of 30-year bonds trading around 3 percent.
“Treasury is going to need to increase front-end supply pretty notably,” Cabana said. “Banks losing reserves will be looking to replicate those assets.”
Assuming Treasury ramps up bill supply, rates on debt maturing in less than one year would likely rise, forcing up the overnight rate on Treasury repurchase agreements. That may cause usage of the Fed’s fixed-rate overnight reverse repurchase agreement facility to sink as investors pivot away from the operation.
“Overall, this should pressure rates higher, with banks having relatively more securities to finance in the repo market as time goes on,” said Scott Skyrm, managing director at Wedbush Securities in New York.
While Google and Facebook are the undisputed advertising leaders online, companies are increasingly looking for other digital ways to spend their marketing budgets, according to advertising and public relations company WPP CEO Sir Martin Sorrell.
"What our clients want and what our agencies want is more competition of the space, anything that gives more competition to the duopoly of Facebook and Google," Sorrell said to CNBC.
The two tech giants account for about 75 percent of digital ad budgets, according to Sorrell. But there are competitors ready to chip away at their dominance, including AOL and Yahoo's ad tech platforms and Snap. Even Amazon, a company recently valued at around $350 billion, is becoming a threat with its ad platform, he pointed out.
"Getting more than two solutions is important," he said.
"Starting today, we will no longer serve ads on YPP [YouTube Partner Program] videos until the channel reaches 10k lifetime views," according to a blog post by Ariel Bardin, vice president of product management.
"This new threshold gives us enough information to determine the validity of a channel," Bardin continued. "It also allows us to confirm if a channel is following our community guidelines and advertiser policies. By keeping the threshold to 10k views, we also ensure that there will be minimal impact on our aspiring creators. And, of course, any revenue earned on channels with under 10k views up until today will not be impacted."
YouTube's decision could have a negative side effect. As "YouTubers" know, building an audience large enough to turn a decent profit making videos takes time. Without an assist by a popular creator or a stroke of luck—such as a video getting shared by the right social media account or website—new users looking to make a go of creating videos full-time could find themselves swimming upstream.
YouTube seems to have anticipated such a consequence. "In a few weeks, we’ll also be adding a review process for new creators who apply to be in the YouTube Partner Program," Bardin continued. "After a creator hits 10k lifetime views on their channel, we’ll review their activity against our policies. If everything looks good, we’ll bring this channel into YPP and begin serving ads against their content. Together these new thresholds will help ensure revenue only flows to creators who are playing by the rules."
The US pharmaceutical industry is on the brink of a new ecosystem — but it's not taking off as smoothly as expected.
Up until the past few years, biologic drugs made from living cells didn't face competition once they lost patent protection. That's been changing with the introduction of drugs called biosimilars. But their rollout hasn't exactly been the game-changing experience some had expected.
"We believe that biosimilars will capture meaningful market share, but the disappointing commercial success so far with less than $2 billion annual sales illustrates that the bar is high," Morgan Stanley analysts said in a report on Wednesday. That's in large part because of the economic challenges that biosimilars face, the report says.
Biosimilars are a bit more complicated than your average competing medicine: Unlike generics for chemical-based drugs like antibiotics that can be interchangeable with branded versions, the copycats of biologic medications, produced using living cells, have a few more caveats.
As it stands right now, biosimilars can't be used interchangeably with branded versions, meaning if you were to get a prescription for a branded biologic, you wouldn't be able to opt for the "generic" one at the pharmacy as easily as you could if the drug was, say, a statin.
It also takes more time, energy, and money to get a biosimilar approved than a generic medicine. Developing a biosimilar usually takes about eight years and can cost about $250 million. In comparison, a generic takes a quarter of that time (about two years) and a small fraction of the cost (about $5 million) to develop.
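A quick sanity check of the ratios implied by those figures (the year and dollar values are the article's approximations):

```python
# Back-of-the-envelope comparison of biosimilar vs. generic development,
# using the approximate figures cited in the text.
biosimilar_years, biosimilar_cost_m = 8, 250     # ~8 years, ~$250M
generic_years, generic_cost_m = 2, 5             # ~2 years, ~$5M

time_ratio = generic_years / biosimilar_years    # a quarter of the time
cost_ratio = generic_cost_m / biosimilar_cost_m  # a fiftieth of the cost

print(f"generic takes {time_ratio:.0%} of the time, {cost_ratio:.0%} of the cost")
```

On these numbers, the cost gap (roughly 50x) is even larger than the time gap (4x), which underlines why biosimilar economics are so much harder than generic economics.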
Having more biosimilars in the US would be a big deal: It might be the best way to drive down the cost of biologic medications that have been around for a while. The savings of putting people on far less costly biosimilars — even just new patients who have never taken the original — are estimated to be billions of dollars. Express Scripts, a pharmacy benefit manager, estimated in 2013 that the US could be saving $250 billion over the next 10 years because of biosimilars.
The biologic medicine market is roughly $200 billion, according to Morgan Stanley, which makes that $2 billion a bit lackluster.
The biosimilars haven't come at much of a discount to their branded counterparts: discounts of 15% to 30% off the branded drug's list price, compared with generics, which are typically priced 80% to 90% below the branded version. As a result of the still-relatively-high cost, many people haven't switched to biosimilars the way they have to generic drugs.
"While we acknowledge that biosimilars could represent a real sales opportunity, we believe that the economics of biosimilars remains challenged," the note said.
Morgan Stanley highlighted a few "winners": companies in the best position to profit from biosimilars, judged both on the companies themselves and on the drugs they're going after. Celltrion, Sandoz and Amgen are best placed, according to the US bank.
Trump's pick to lead the FDA, Dr. Scott Gottlieb, echoed the disappointment in his hearing before a Senate committee Wednesday.
"Many of us have been disappointed by the economic savings we’ve seen from biosimilars so far," Gottlieb said. "But I do think there’s a lot of opportunity for these to have meaningful impact on consumers and spending going forward."
Gottlieb also pointed to some approaches he might take as commissioner, such as addressing whether biosimilars could be used interchangeably, like how generics are used.
With those changes, it's possible the future of biosimilars could shake out closer to expectations.
Grab a cup of coffee, sit back and absorb this piece, which I believe will blow your mind. I had read a good deal on self-driving cars and the implications of what lies ahead, but this piece by Ben Evans has completely rewritten my sense of what life will be like in ten years. Wowsa! I know what I'll be dreaming about tonight. *lol* Enjoy-
There are two foundational technology changes rolling through the car industry at the moment: electric and autonomy. Electric is happening right now, largely as a consequence of falling battery prices, while autonomy, or at least full autonomy, is a bit further off - perhaps 5-10 years, depending on how fast some pretty hard computer science problems get solved. Both of these will cycle into essentially the entire global stock of (today) around 1.1bn cars over a period of decades, subject to all sorts of variables, and both will completely remake the car industry and its suppliers, as well as parts of the tech industry.
Both electric and autonomy have profound consequences beyond the car industry itself. Half of global oil production today goes to gasoline, and removing that demand will have geopolitical as well as industrial consequences. Over a million people are killed in car accidents every year around the world, mostly due to human error, and in a fully autonomous world all of those (and many more injuries) will also go away.
However, it's also useful, and perhaps more challenging, to think about second and third order consequences. Moving to electric means much more than replacing the gas tank with a battery, and moving to autonomy means much more than ending accidents. Quite what those consequences would be is much harder to predict: as the saying goes, it was easy to predict mass car ownership but hard to predict Wal-mart, and the broader consequences of the move to electric and autonomy will come in some very widely-spread industries, in complex interlocked ways. Still, we can at least point to where some of the changes might come. I can't tell you what will happen to car repairs, commercial real-estate or buses - I'm not an expert on any of those, and neither can anyone who is - but I can suggest that something will happen, and probably something big. Hence, this post is not a description of what will happen, but of where it might, and why, with some links to further reading.
Moving to electric reduces the number of moving parts in a car by something like an order of magnitude. It's less about replacing the fuel tank with a battery than ripping out the spine. That remakes the car industry and its supplier base (as well as related industries such as machine tools), but it also changes the repair environment, and the life of a vehicle. Roughly half of US spending on car maintenance goes on things that are directly attributable to the internal combustion engine, and much of that spending will just go away. In the longer term, this change might affect the lifespan of a vehicle: in an on-demand world vehicles would have higher loading, but absent that, fewer mechanical breakages (and fewer or no accidents) might mean a longer replacement cycle, once the rate of technology implementation settles down.
Next, gas itself is bought in gas stations, of which there are about 150k in the USA. Those will also go away (unless there are radical changes in how long it takes to charge an EV). Since gas is sold at very low margins, these retailers make their actual money as convenience stores, so what happens to the products that are sold there? Some of this demand will be displaced to other retailers, and some may be going online anyway (especially if an Amazon drone can get you a bag of Cheesy Puffs in 15 minutes). But snacks, sodas and tobacco sell meaningful proportions of their total volume as impulse purchases attached to gasoline. Some of that volume might just go away.
Tobacco in particular might be interesting - well over half of US tobacco sales happens at gas stations, and there are meaningful indications that removing distribution reduces consumption - that cigarettes are often an impulse purchase and if they're not in front of you then many smokers are less likely to buy them. Car crashes kill 35k people a year in the USA, but tobacco kills 500k.
Gasoline is taxed, much less in the USA than in many other developed markets: it is 4% of UK tax revenue, for example. That tax revenue will have to be replaced, with other taxes on things that may be more elastic, and there will be economic and political consequences to that. In the USA, for example, highways are funded partly from gas taxes that have not risen to match inflation since 1993 - if just keeping it flat in real terms was politically impossible, how hard will it be to take that revenue from some other part of the economy?
Conversely, in many places (especially emerging markets) fuel is subsidised by the state - coal, gasoline and kerosene (for light and heat - see for example kerosene subsidies in India). EVs on one hand and solar on the other may change this as well.
Meanwhile, of course, we will still actually need to charge our EVs. Most estimates suggest that charging a fully electric fleet would lead to 10-20% more electricity demand. However, a lot depends on when they're charged: if they're charged off-peak this might not need more total generating capacity, though it would still change output and perhaps local distribution. The carbon impact of shifting electricity generation in this way is pretty complex (for example, over 75% of French electricity generation today comes from nuclear power), but in principle, at least some grid generation now almost always comes from renewables.
More speculatively (and this is part of Elon Musk's vision), it is possible that we might all have large batteries in the home, storing off-peak power both to charge our cars and power our homes. Part of the aim here would be to push up battery volume and so lower their cost for both home storage and cars. If we all have such batteries then this could affect the current model of building power generation capacity for peak demand, since you could complement power stations with meaningful amounts of stored power for the first time.
The really obvious consequence of autonomy is a near-elimination of accidents, which kill over 1m people globally every year. In the USA in 2015, there were 13m collisions, of which 1.7m caused injuries; 2.4m people were injured and 35k people were killed. Something over 90% of all accidents are caused by driver error, and a third of fatal accidents in the USA involve alcohol. Looking beyond the deaths and injuries themselves, there is also a huge economic effect: the US government estimates a cost of $240bn a year across property damage, medical and emergency services, legal costs, lost work and congestion (for comparison, US car sales in 2016 were around $600bn). A similar UK analysis found a cost of £30bn, which is roughly equivalent after adjusting for population. This cost comes from government (and so taxes), insurance and individual pockets. It also means jobs, of course.
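The "roughly equivalent adjusted for the population" claim can be sanity-checked in a few lines. The population figures and exchange rate below are my own rough assumptions, not figures from the text:

```python
# Per-capita accident-cost comparison, US vs UK.
# Populations and exchange rate are rough illustrative assumptions.
us_cost_usd = 240e9          # US government estimate, from the text
uk_cost_gbp = 30e9           # UK analysis, from the text
us_pop, uk_pop = 320e6, 65e6 # assumed approximate populations
gbp_to_usd = 1.3             # assumed exchange rate

us_per_capita = us_cost_usd / us_pop               # dollars per person
uk_per_capita = uk_cost_gbp * gbp_to_usd / uk_pop  # dollars per person
print(f"US ${us_per_capita:.0f}/person vs UK ${uk_per_capita:.0f}/person")
```

That works out to roughly $750 per person in the US against roughly $600 in the UK: the same order of magnitude, consistent with the claim.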
Even simple 'Level 3' systems would cut many kinds of accident, and as more vehicles with more sophisticated systems, moving up to Level 5, cycle into the installed base over time, the collision rate will drop continuously. There should be an analogue of the 'herd immunity' effect - even if your car is still hand-driven, my automatic car is still much less likely to collide with you. This also means that cycling would become much safer (though you'd still need to live close enough to where you wanted to go), and that in turn has implications for public health. You might never get to zero accidents - the deer running in front of a car might still get hit sometimes - but you might get pretty close.
That, in turn, has consequences for vehicle design - if you have no collisions then eventually you can remove many of the safety features in today's vehicles, all of which add cost and weight and constrain the overall design - no more airbags or crumple zones, perhaps. A decade ago the NHTSA estimated that the safety measures that it mandates collectively added $839 (in 2002 dollars so $1,136 now) and 125 pounds of weight, which was 4% of both average cost and average weight - this is probably a lower bound. That, of course, presumes that there are no other changes to the design as a result of removing the human controls - which is like removing the reins from a horseless carriage and thinking nothing else will change.
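The inflation adjustment in the NHTSA figure (2002 dollars brought up to date) is a simple CPI rescaling. The CPI index values below are approximate annual averages and are my own assumption, not from the text:

```python
# Adjusting the NHTSA's 2002-dollar figure to roughly current dollars
# via CPI scaling. CPI-U values are approximate annual averages
# (illustrative assumption, not from the article).
cpi_2002 = 179.9
cpi_2017 = 243.6

cost_2002 = 839.0
cost_now = cost_2002 * (cpi_2017 / cpi_2002)
print(round(cost_now))  # close to the ~$1,136 cited in the text
```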
As more and more cars are driven by computer, they can drive in different ways. They don't suffer from traffic waves, they don't need to stop for traffic signals and they can platoon - they can safely drive 2 feet apart at 80 mph. There is a whole range of human behaviors that reduce road capacity, especially on freeways: it's not just that people make mistakes, but that computers can drive in totally different ways to even a perfect human driver. The video below illustrates one of these issues, familiar to anyone who's been stuck in a traffic jam on a highway and got to the front to find no apparent cause - human behaviour causes traffic waves, which cause 'phantom jams'. Computers wouldn't do this, and if they did, we could stop them.
A fully autonomous road system changes traffic less in the manner of fluid dynamics than in the manner of a network moving from circuit-switched to packet-switched - or, more precisely, from TDMA to CDMA. No lanes, no separation, no stopping distances and no signals (except, of course, for pedestrians to cross) means profoundly different traffic patterns.
Clearly, all of this will have some effect on congestion and road capacity. Accidents themselves cause as much as a third of congestion (estimates vary a fair bit and depend whether you're talking about highways or city centres), even if there are no changes from different driving behavior. How much changes over all, though - how much more traffic can a highway hold? How much more quickly do you get to school in the morning if you drive at the same speed but don't have to stop at every stop sign just in case there's someone there? We'll find out.
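The "how much more traffic can a highway hold" question lends itself to a back-of-the-envelope sketch. This minimal model compares lane capacity under a human two-second following rule with the two-foot platooning gap mentioned above; the assumed car length and the two-second rule are my own illustrative inputs, not figures from the essay:

```python
# Rough lane-capacity estimate: human following distance vs. tight platooning.
# Car length and the two-second rule are illustrative assumptions; the
# "2 feet apart at 80 mph" platooning gap comes from the text.
mph = 80
ft_per_s = mph * 5280 / 3600      # ~117.3 ft/s
car_len = 15.0                    # assumed average car length, feet

human_gap = 2.0 * ft_per_s        # two-second rule -> ~235 ft of gap
auto_gap = 2.0                    # platooning gap, feet

def cars_per_hour(gap):
    headway = car_len + gap       # front bumper to front bumper, feet
    return 3600 * ft_per_s / headway

human = cars_per_hour(human_gap)  # on the order of 1,700 cars/hour/lane
auto = cars_per_hour(auto_gap)    # on the order of 25,000 cars/hour/lane
print(f"capacity multiple: {auto / human:.1f}x")
```

Even this crude model suggests an order-of-magnitude gain in theoretical lane capacity, before induced demand, discussed below, eats into it.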
However, the impact of autonomy on traffic and congestion is more complex than just making driving itself more efficient. Though automatic driving should increase capacity, we have known for a long time that increased capacity induces more demand - more capacity means more traffic. If you reduce congestion, then more people will drive, either taking new trips or switching from public transport, and congestion might rise back to where you started. Conversely, removing capacity can actually result in less congestion (and there's more complexity here too - for example, Braess' paradox). So, autonomous driving gives us more capacity, and in a sense it does so for free, since we don't have to build roads, just wait for everyone to buy new cars, but it also gives us more use.
Parking is another way that autonomy will add both capacity and demand. If a car does not have to wait for you in walking distance, where else might it wait, and is that more efficient? Does that enable better land use, better traffic routing and more or less congestion? And, in parallel, everything that you do to make traffic, driving and now also parking more efficient tends to generate more demand.
So, the current parking model is clearly a source of congestion: some studies suggest that a double-digit percentage of traffic in dense urban areas comes from people circling around looking for a parking space, and on-street parking ipso facto reduces road capacity. An autonomous vehicle can wait somewhere else and an on-demand one just drops you off and goes off to collect other people. On the other hand, both of these models create new trips as well - both your car and an on-demand car would have to come to get you (though, since cars will be automatic, they will form an orderly queue). But with enough density of on-demand, the car you get into might be the car that's already passing, or that dropped someone else off 50 feet away - it all depends on the load factor.
Parking itself is important not just as a part of the traffic and congestion dynamic but as a cost and as a use for property. As mentioned above, some parking is on-street, and so removing it adds road capacity or allows you to add more space for pedestrians. Some of it is at work or retail, or more generally in city centres, and so that land becomes available for other uses. And some of it is at home, either on-street (again using capacity) or in drives and garages, parking lots or parking structures, which add to the cost of housing. The extreme case here is Los Angeles: it has been estimated that 14% of the incorporated land of LA county is used for parking. Adding parking to a new development pushes up construction costs: parking garages cost money, and so does leaving land vacant for parking lots. A study in Oakland, in the San Francisco Bay Area, found that government-mandated parking requirements pushed up construction costs per apartment by 18%. Back in LA, adding underground car-parking to a shopping mall might double the construction cost. If you both remove those costs on new construction, and make that space available for new uses, how does that affect cities? What does it do to house prices, or to the value of commercial real-estate?
Pretty much all of these themes feed into the potential of on-demand. If you remove the cost of the human driver from an on-demand trip, the cost goes down by perhaps three quarters. If you can also remove or reduce the cost of the insurance, once the accident rate has fallen, it goes down even further. So, autonomy is rocket-fuel for on-demand. This makes it much easier for many more people to dispense with a car, or only have one, or leave their car at home and take an on-demand ride for any given trip.
This obviously has consequences for parking - an on-demand ride to work or a restaurant removes parking in the city centre, and not owning a car and substituting on-demand entirely removes demand for residential parking. And, as mentioned above, using an on-demand ride instead of looking for parking gets rid of one kind of traffic but creates a new kind - potentially a smaller one, though.
However, truly cheap on-demand has more consequences still. For example, it displaces demand from public transport - though the cost of a bus driver is also a large part of the cost of the trip, and those drivers might not be needed either, so buses might also be cheaper. Conversely, if congestion falls then buses could become more attractive than other forms of transport (both cars and also subways) because the journey time would be shorter (or at least more predictable). This of itself has all sorts of cascading effects. Do you end up with reduced bus schedules? Do marginal bus-routes close, pushing people onto on-demand who might not otherwise have used it - if they can use it? Does a city provide, or subsidise, its own on-demand service to replace or to supplement buses in lower-density areas? Does your robotaxi automatically drop you off at a bus stop on the edge of high-traffic areas, unless you pay a congestion charge? This all then ripples back into congestion - buses carry people at higher density than cars, and so replacing a fully loaded bus with cars would inherently create more traffic volume, but buses do not in fact travel full all of the time, and can create their own congestion (an endemic issue in London's Oxford Street, for example). And, especially on Oxford Street, they carry more people than cars because they're aggregating people onto a single route who might otherwise have taken many other separate, more direct or more efficient routes. If 50 people on a bus switch to cars, they won't all be on the same road at the same time. Meanwhile, the fixed cost of a bus creates a minimum loading level and density at which a bus is practical - breaking this apart into smaller vehicles, maybe with one passenger, maybe with 10, might extend 'public' transport to many more people.
Perhaps the most useful way to think about this is that, just as on-demand erodes the difference between marked and mechanically metered taxis and car-services, so it also erodes the difference between both of those and buses. What exactly are the differences in traffic dynamics between a Lyft Line shuttle with 5 passengers and a municipal bus with an off-peak load of 10? Recall, too, that buses weren't always municipal, and there are parallel commercial alternatives today - see Chariot, or matatus.
The point here is not remotely to suggest that it is inherently good or desirable to replace public transport with cars, but that it now becomes possible to do so, if we want, and that it might be cheaper and more efficient in some circumstances. And, indeed, that the distinction between 'car' and 'bus' might break down.
Then, of course, there are the drivers. There are something over 230,000 taxi and private car drivers in the USA and around 1.5m long-haul truck-drivers. The question of what happens to taxi and on-demand drivers has been discussed too widely and publicly for me to add anything here, but long-haul truck drivers have some interesting nuances (I'm here excluding local delivery drivers as they're often needed for more than driving the truck itself, and robotics is a whole other conversation). The average age of a long-haul driver is now 49, and around 90,000 leave the industry every year, half through retirement. The industry thinks it has a shortage of around 50,000 drivers, and growing - people are leaving faster than they can be replaced. Truck driving can be an unhealthy, uncomfortable job with a difficult lifestyle. Hence, on these numbers, over half the current driver base will have left in ten years, around the time that most people think full, level 5 autonomy might be working. In the short term, level 4 autonomy makes truck-driving more attractive, since you can rest in the back of the truck until you're needed instead of having to stop at mandated times. But on a 20-30 year view, which is really the timeline to think about this transition, effectively all current truck drivers will have quit anyway - you won't replace them, but you won't necessarily put anyone directly out of work - until you start looking at truck stops, which takes us right back to the convenience store discussion at the beginning of this piece. And meanwhile, truck-stop operators are already starting to think about the fundamentally different trucking patterns that might come from a shift in the logistics industry away from serving traditional retail and towards serving ecommerce (i.e. Amazon).
Pulling all of these threads together: if parking goes away, road capacity increases by, perhaps, several times, and an on-demand ride is the cost of a coffee, then one needs to start thinking much more generally - not just about cars, trucks and roads, but about cities, land use and real-estate. In fact, one needs to think about cities. Cars have remade cities over the past century, and if cars are now going to change entirely, cities will change too.
So, big-box retail is based on an arbitrage of land costs, transport cost and people's willingness to drive and park - how does autonomy change that? How do cities change if some or all of their parking space, especially in town centres, is now available for new needs, or dumped on the market, or moved to completely different places? Where are you willing to live if 'access to public transport' is 'anywhere' and there are no traffic jams on your commute? Does an hour-long commute with no traffic and no need to watch the road feel better or worse than a half-hour commute stuck in near-stationary traffic staring at the car in front? How willing are people to go from their home in a suburb to dinner in a city centre on a dark cold wet night if they don't have to park and an on-demand ride is cheap? What happens to rural pubs if you don't have to worry about drink-driving anymore? And what do you DO in the car, while it's taking you somewhere? Long Netflix and brewers, short BAT - and medevac helicopters.
Finally, remember the cameras. Pretty much every vision of autonomous cars involves them using HD, 360-degree computer vision. That means that every AV will be watching everything that goes on around it - even the things that are not related to driving. An autonomous car is a moving panopticon. They might not be saving and uploading every part of that data. But they could be.
By implication, in 2030 or so, police investigating a crime won't just get copies of the CCTV from surrounding properties, but get copies of the sensor data from every car that happened to be passing, and then run facial recognition scans against known offenders. Or, perhaps, just ask if any car in the area thought it saw something suspicious.
Wall Street pros say bull markets don’t die of old age. But after eight years of rising stock prices, being on the lookout for signs of a market peak makes good financial sense.
No bull lasts forever. Good times eventually are followed by bad ones, as investor euphoria gives way to fear and despair. The performance history of the Standard & Poor’s 500 stock index drives home the point: The 12 bull markets since the 1930s have all been followed by bear markets, or downturns of 20% or more, according to S&P Dow Jones Indices. The average bear market decline is a sizable 40%. Then there are the mega-bears, like the 2007-2009 rout during the financial crisis that knocked the S&P 500 down 57% and the nearly 50% slide after the internet stock bubble burst in 2000.
The current bull run, the second-longest in history and one that's generated a fourth-best gain of 254%, will eventually tire out, hit one final peak and head lower like all the rest.
The only question is when.
James Stack, a market historian and president of money-management firm InvesTech Research, says there are seven warning flags that can signal trouble ahead. The more flags that are present at one time, the more danger there is. Only one of those warning signals is now flashing yellow, he says.
The other good news, he adds, is that bull markets don’t typically “end with a big bang.” Market tops are usually slower-moving events that play out over many weeks, which gives investors time to prepare.
Here are Stack's seven warning flags:
The more bullish, optimistic and confident the investing public is, the riskier the market becomes. “Bear markets bottom in doom and gloom," says Stack. "Bull markets peak when optimism is highest.”
So what are signs of “extreme optimism”?
Bullish headlines in the news, such as the recent Barron’s cover story, “Next Stop: Dow 30,000.” Hot IPOs, like Snap's 44% jump in its debut last week. A dearth of scared investors, measured by a closely followed "fear gauge," dubbed the VIX, which is now hovering near an all-time low. Skyrocketing consumer confidence measures, such as the Conference Board’s February survey, which registered its highest reading in 15 years. The Dow Jones industrial average’s recent run of 12 record highs in a row also fits the bill. Rising stock valuations are another red flag, and currently the market is trading at close to 20 times earnings, well above the historical average and double where it traded back in March 2009.
“We have exuberance now,” Stack says, noting this is the only yellow flag from the market so far.
Stocks kick off bullish March with a roar
The 8-year-old bull has been powered by zero interest rates for nearly a decade. But the Federal Reserve has hiked short-term rates twice in the past 15 months to 0.75%. Fed chair Janet Yellen warns that three more hikes of a quarter percentage point apiece could come this year, with the next hike possibly coming at the Fed's meeting next week. Futures markets place nearly 90% odds of a rate increase on March 15, according to CME Group.
In the past, stock market uptrends have been derailed by the Fed hiking rates faster and more aggressively than expected. Higher rates slow down the economy, which hurts the profitability of U.S. companies, a key propellant of stock prices. Higher rates also make it harder for borrowers to keep up with their debt payments, which could dent consumer spending and undermine the health of businesses with high debt loads.
“Fed policy," Stack says, "is a very important wildcard.”
“Bear markets and recessions go hand in hand,” Stack says. Seven of the past eight bull markets were undone by economic contractions, RBC Capital Markets data show. Recessions cause job losses, crimp consumer spending and squeeze corporate profits. Signs of trouble include weaker-than-expected incoming economic data, especially the Conference Board’s Leading Economic Index, which consists of 10 data points that predict future economic performance. If quarterly GDP, or economic growth, starts to slow, that’s another red flag, Stack warns. Any sign that the manufacturing or services segments of the economy are turning down is also bad. The latest reading on fourth-quarter 2016 GDP, however, was 1.9%, down from 3.5% in the third quarter, but far from the recessionary danger zone. First-quarter 2017 economic growth is estimated at 1.9%, Barclays says.
American shoppers account for roughly two-thirds of U.S. economic activity. So any signs that consumers are not spending as much is cause for alarm. In February, the Conference Board's closely followed consumer confidence index hit 114.8, its highest level since July 2001. That’s a far better reading than when confidence plunged below 30 in 2008 during the financial crisis.
When stock market leaders, or bellwether stocks that are sensitive to changes in the economy, start to turn down after profitable advances, that's an early sign that investors are losing confidence in the market, Stack says. Stocks to watch: ones that do best when times are good, such as banks, transportation companies and businesses that sell stuff to consumers that isn’t needed for daily survival.
A rising market driven by fewer and fewer stocks is a bearish sign. Clues include more stocks going down than up on a daily basis and more stocks hitting 52-week lows than highs. “It is one of the most reliable bear market warning flags,” says Stack.
When the number of stocks hitting their lowest price levels in a year starts to swell, and if the new low list grows day after day, it’s a sign that the "smart money," or professional investors, are bailing out of the market. “It means investors are becoming desperate to sell, even ... at a loss,” Stack says.
For now, most of these warning flags are not flashing yellow, says Stack. But the fact the bull market has lasted so long makes him “nervous and more watchful of these warning flags.”
Technology is only as good as the materials it is made from.
Much of the modern information era would not be possible without silicon and Moore’s Law, and electric cars would be much less viable without recent advances in the material science behind lithium-ion batteries.
That’s why graphene, a two-dimensional supermaterial made from carbon, is so exciting. It’s harder than diamonds, 300x stronger than steel, flexible, transparent, and a better conductor than copper (by about 1,000x).
If it lives up to its potential, graphene could revolutionize everything from computers to energy storage.
Graphene: Is It the Next Wonder Material?
The following infographic comes to us from 911Metallurgist, and it breaks down the incredible properties and potential applications of graphene.
While the properties and applications of graphene are extremely enticing, there has been one big challenge with graphene: the cost of getting it.
The Ever-Changing Graphene Price
As you can imagine, synthesizing a material that is one atom thick is a process that has some major limitations. Since a sheet of graphene 1 mm thick (about 1/25 of an inch) requires three million layers of atoms, graphene has been quite cost-prohibitive to produce in large amounts.
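As a sanity check, the "three million layers" figure can be reproduced from the spacing between carbon layers in graphite, roughly 0.335 nm (that spacing is an assumption here, not a figure from the article):

```python
# Back-of-the-envelope check of the "three million layers" figure.
# Assumes graphite's interlayer spacing of ~0.335 nm (not stated in the text).
sheet_thickness_m = 1e-3      # a 1 mm-thick sheet
layer_spacing_m = 0.335e-9    # ~0.335 nm between carbon layers (assumption)

layers = sheet_thickness_m / layer_spacing_m
print(f"{layers / 1e6:.1f} million layers")  # ~3.0 million
```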
Back in 2013, Nature reported that one micrometer-sized flake of graphene cost more than $1,000, which made graphene one of the most expensive materials on Earth. However, there has been considerable progress in this field since then, as scientists search for the “Holy Grail” of scalable graphene production processes.
By the end of 2015, Deloitte estimated that the market price per gram was close to $100. And today, graphene can now be ordered straight from a supplier like Graphenea, where multiple products are offered online ranging from graphene oxide (water dispersion) to monolayer graphene on silicon wafers.
One producer, NanoXplore, even estimates that good-quality graphene is now down to a cost of $0.10 per gram, though this excludes graphene created through a CVD process (recognized as the highest level of quality available for bulk graphene).
The following graphic from Nature (2014) shows some methods for graphene production – though it should be noted that this is a quickly-changing discipline.
As the price of graphene trends down at an impressive rate, its applications will continue to grow. However, for graphene to be a true game-changer, it will have to be integrated into the supply chains of manufacturers, which will still take multiple years to accomplish.
Once graphene has “real world” applications, we’ll be able to see what can be made possible on a grander scale.
However, when you look under the surface of the market-cap-weighted indexes at median valuations, they are currently far more extreme than they were back then. As my friend John Hussman puts it, this is now “the most broadly overvalued moment in market history.”
Critics will say “valuations aren’t an effective timing tool.” I’m not saying they are. But if you believe that “the price you pay determines your rate of return” then at current prices you must believe we currently face some of the worst prospective returns in history.
Believe it or not, autonomous vehicles have been many decades in the making.
Even in 1939, General Motors had an exhibit called “Futurama” at the New York World’s Fair that presented a model of the world 20 years in the future. Central to this display was a system of automated highways and vast suburbs, with a focus on how automation could reduce traffic congestion and lead to the free-flowing movement of people and goods.
Since then, many autonomous vehicle concepts have popped up at various times – but they have always fallen short due to technical limitations. Only recently, due to advances in technology, have self-driving cars been able to overcome three primary engineering challenges: sensing the surrounding environment, processing information, and reacting to that environment.
Today, the future for autonomous vehicles is bright, and it is expected that there will be millions of self-driving cars on the road by 2035, creating a multi-billion dollar market.
Autonomous Vehicles: What You Need To Know
The following infographic comes to us from Get Off Road, and it shows the history of autonomous vehicles, how they work, the technical challenges overcome so far, and what the near-future of driverless cars may look like.
Over the last few years, we’ve seen a significant downtick in the number of IPOs issued by companies, but will 2017 break that trend? So far this year we have seen five companies go public on a U.S. stock exchange, and today we saw the first tech IPO of the year with Snap, Inc.
Snap, Inc. is a technology and social media company known for its mobile app Snapchat, which allows users to share photos and videos with friends for moments to hours before disappearing. Founded in July 2011, what began as a tech start-up garnered 23 active investors and raised around $2.6 billion in venture capital backing.
Now that Snap, Inc. has gone public with an IPO priced at $17 per share, ahead of the expected $14-$16 a share range, it’s trickier to forecast its performance. Looking at some of Snap’s numbers, investment attractiveness is likely to be in the eye of the beholder.
The company’s year-over-year sales growth as well as its EBIT growth are certainly noteworthy. Snap’s year-over-year sales growth is roughly 589%, with $58.7 million in revenues in 2015 and $404.5 million in revenues in 2016. Additionally, it saw a 36.32% decline in EBIT, which is significantly less than an aggregate of companies similar to it.
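The growth figure follows directly from the reported revenues; a quick check of the arithmetic:

```python
# Snap's year-over-year sales growth from reported revenues ($ millions).
rev_2015 = 58.7
rev_2016 = 404.5

yoy_growth_pct = (rev_2016 - rev_2015) / rev_2015 * 100
print(f"{yoy_growth_pct:.0f}%")  # ~589%
```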
The biggest buzz since Snap’s initial IPO announcement was the large net losses of the company; surprising, since historically the company had seemed profitable to the public eye. With net losses of over $514 million, many investors are worried about Snap’s ability to become profitable over time. Additionally, year-over-year net income declined 38%. While this has probably influenced some potential shareholders, it seems like there is still a lot of optimism in the investing community around the Snap IPO and growth for the company in the future.
To give perspective on Snap’s recently reported financials, the S&P 500 Technology Sector as a whole saw a 2.26% decline in the 12 months ending in December 2016. Perhaps more significantly, Snap’s fellow social media competitors Twitter (TWTR) and Facebook (FB) saw 18.4% and 98.4% growth in EBIT respectively.
Comparing Snap to Previous IPOs
Compared to its peers in the tech industry, Snap’s sales growth gets solid marks, but its EBIT growth could conceivably raise concerns. Is this something interested investors can bank on for post-IPO performance? We can look back at Twitter and Facebook the year leading up to their IPOs to understand the performance of post-IPO tech companies.
Looking first at Twitter, which had its initial public offering in November 2013, sales growth was also very strong, but EBIT and net income growth were lacking, as with Snap. Since then, sales growth has decreased from over 109% year-over-year to roughly 14% year-over-year in 2016. While EBIT growth increased from -724% in 2013 to over 18% in 2016, the company appears to have struggled to keep that number positive. There is a similar trend with net income growth over time as well. These metrics are very similar to Snap’s pre-IPO numbers.
Facebook, another key competitor of Snap, has a slightly different story than Twitter. While Facebook also had strong sales growth and negative EBIT and net income growth prior to its IPO in 2012, it has consistently grown year-over-year since then to become currently profitable. Sales have consistently grown almost 50% year-over-year since 2013. Additionally, Facebook was able to turn around negative EBIT growth prior to its IPO in 2012 by increasing EBIT by over 442% in 2013. Since its IPO, Facebook has had positive net income, increasing in 2016 by an astounding 177%.
Ahead of its IPO, Snap had significantly more capital at its disposal than any of its peers did at the same stage, with 24 investors and $14.29 billion.
As we’ve seen, an estimated $2.62 billion of that capital was raised from venture capital, while Facebook ($1.54 billion) and Twitter ($1.55 billion) both had more than a billion dollars less prior to their IPOs. Snap has clearly been able to raise more funds, which could translate into immediate performance, unlike with Twitter and Facebook, which performed poorly early on.
Looking at the IPOs of Facebook, Twitter, and wearable technology company Fitbit, Inc., on an absolute basis, we see an overall price decline over the six months following their IPOs. When comparing the companies to the market, only Fitbit, Inc. outperformed, and that was after suffering initial declines in its first few weeks.
We don’t know what the Snap IPO could lead to down the line. However, with financials that line up closely with peers Twitter and Facebook, two divergent paths seem possible. Will Snap follow in the footsteps of these competitors, or will it start a new trend altogether in the tech industry? With an IPO said to be the biggest since Alibaba (BABA) in 2014, it will be interesting to see if the company can live up to the hype.
I found this interesting (the rise); however, I have my own reservations because of the possible change in rates and inflation in 2017. When inflation rises, interest rates also normally rise to maintain real rates within an appropriate range. P/E ratios need to decline to reflect the increase in the earnings discount rate. Another way to look at it is that equities then face more competition for money from fixed-income instruments. The price of equities must therefore decline to keep or attract investors. Then there is the Rule of 20 to consider: the Rule of 20 equals P/E plus long-term interest rates (average of the 10- and 30-year bond rates). If that sum is at or below 20 minus inflation, the market is a buy. If it is above 20 minus inflation, the market is a sell. Today we're at just about 20. I think I'll keep my cautious side up, keep moving up my alerts and stick to only brief swings. Something tells me it's going to be an interesting year. All focus on the Fed and inflation.
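The comment's version of the Rule of 20 is simple enough to sketch. This follows the rule exactly as the comment states it; the sample inputs below are illustrative assumptions, not figures from the text:

```python
# Rule of 20, as stated in the comment above: compare P/E plus
# long-term rates (average of 10- and 30-year bond yields) against
# 20 minus inflation. Rates and inflation are in percent.
def rule_of_20_signal(pe, yield_10y, yield_30y, inflation):
    long_rate = (yield_10y + yield_30y) / 2
    score = pe + long_rate
    threshold = 20 - inflation
    return "buy" if score <= threshold else "sell"

# Illustrative inputs only (assumed, not from the comment):
print(rule_of_20_signal(pe=17.6, yield_10y=2.4, yield_30y=3.0, inflation=2.1))  # sell
```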
During the past week (on February 15), the value of the S&P 500 closed at yet another all-time high at 2349.25. As of today, the forward 12-month P/E ratio for the S&P 500 stands at 17.6, based on yesterday’s closing price (2347.22) and forward 12-month EPS estimate ($133.49). Given the high values driving the “P” in the P/E ratio, how does this 17.6 P/E ratio compare to historical averages? What is driving the increase in the P/E ratio?
The current forward 12-month P/E ratio of 17.6 is now above the four most recent historical averages: five-year (15.2), 10-year (14.4), 15-year (15.2), and 20-year (17.2).
In fact, this week marked the first time the forward 12-month P/E has been equal to (or above) 17.6 since June 23, 2004. On that date, the closing price of the S&P 500 was 1144.06 and the forward 12-month EPS estimate was $65.14.
The Drivers of Change
Back on December 31, 2016, the forward 12-month P/E ratio was 16.9. Since this date, the price of the S&P 500 has increased by 4.8% (to 2349.45 from 2238.83), while the forward 12-month EPS estimate has increased by 0.5% (to $133.49 from $132.84). Thus, the increase in the “P” has been the main driver of the increase in the P/E ratio to 17.6 today from 16.9 at the start of the first quarter.
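The decomposition above is just the forward P/E identity (price divided by the forward 12-month EPS estimate) evaluated at the two dates, using the passage's own figures:

```python
# Forward 12-month P/E = index price / forward 12-month EPS estimate,
# computed at the two dates given in the passage.
price_now, eps_now = 2347.22, 133.49   # recent close and forward EPS
price_dec, eps_dec = 2238.83, 132.84   # Dec 31, 2016 values

pe_now = price_now / eps_now
pe_dec = price_dec / eps_dec
print(round(pe_now, 1), round(pe_dec, 1))  # 17.6 16.9
```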
It is interesting to note that analysts are projecting record-level EPS for the S&P 500 for Q2 2017 through Q4 2017. If not, the forward 12-month P/E ratio would be even higher than 17.6.
Rules and regulations exist to let us know what behaviors we should expect from the people we do business with. Sometimes, good sense or social convention overtake these rules — and they don’t matter so much. Just about everyone wears seat-belts these days (we all know how much they improve our odds of survival in an accident); the ranks of underage smokers have plummeted (it’s no longer cool). Once the toothpaste is out of the tube, as they say, there’s no cramming it back in.
Such is the case with the Department of Labor’s fiduciary rule. On Friday, President Trump asked the Labor Department to review the rule, which requires brokers working with retirement savers to put the interest of their clients ahead of their own. After years of work on it, the regulation was finalized last year by the Obama administration.
At first blush, this looks like a big way the Trump administration could directly affect everyday investors. As it turns out, whether the fiduciary rule hurts you isn’t up to Trump — it’s up to you.
Let me explain. Whether it is overturned by the Trump administration is beside the point. The Labor Department has already taken the key language offline (you can see the earlier text here). Even before the government announced the new standard of care for advisers on retirement accounts, the public had figured it out: Investors have been moving away from high-cost, conflicted advice (with undisclosed kickbacks to brokers on the side) and toward low-cost investment advice where the adviser acts transparently in the investor’s best interests.
They have voted with their feet, and with their dollars.
Long before the 2011 staff report of the Securities and Exchange Commission (Study on Investment Advisers and Broker-Dealers) recommended a uniform fiduciary rule for all investors, the industry was moving in that direction. The fiduciary rule is not shaping investor behavior, it is now catching up with it. It has been six years since the SEC study suggested the standard; while the commission has been stalemated by politics and the Labor Department by the new administration, they are both now far behind the curve.
— Vanguard, the industry leader in low-cost indexing, has attracted $3 trillion since the 2008 financial crisis. It now manages about $4 trillion.
— BlackRock, the world’s largest investment firm, runs over $4 trillion. It notes that it is a “fiduciary for our clients” regardless of whether the new rule is implemented.
— Software-managed investing (aka robo-advisors) and Hybrid (robo/adviser combos) will be $100 billion in the next few years. They already are managing almost $75 billion, according to Michael Kitces, an expert on the advisory business. Kitces notes that just the top five robos – Vanguard Personal Advisor Services (over $40 billion), Schwab Intelligent Portfolios (over $10 billion), Betterment (over $7 billion), Wealthfront ($5 billion) and Personal Capital ($3.4 billion) – alone account for $65 billion in assets under management.
Had it gone into effect as planned in April 2017, the fiduciary rule was likely to have accelerated the process of money moving from expensive and conflicted advice to advice that is lower cost and in the clients’ best interest. Changing the new rule implementation plan won’t stop the underlying trend – at worst it might slow it somewhat.
Regardless, the change is now inevitable. Industry expectations, based on an A.T. Kearney study, are that by 2020, “the DOL’s new fiduciary rule will result in a $2 trillion asset shift” that will save investors roughly $20 billion by not having to pay commissions.
Look at the biggest wirehouses as an example. They had begun a shift toward fee-based accounts several years ago. Three years ago, 27 percent of Morgan Stanley’s client assets were fee-based; today $855 billion of $2.1 trillion in assets, or more than 40 percent of client assets, are in fee-based accounts. The shift is similar for Bank of America Merrill Lynch’s more than 14,000 advisers – they report a move from brokerage to fee-based accounts for their $2.1 trillion in client assets. The trend was similar for Wells Fargo’s $480 billion in assets under management.
In the early 2000s, retail investors whose investments were at these big firms had commission-based brokerage accounts. The primary rules that covered the behavior of brokers were from FINRA, the industry’s self-regulating organization. As you would imagine, letting the industry regulate itself led to all manner of expensive, opaque, conflicted advice that often worked against the interest of the investor, and toward the broker’s financial interest.
Investors have decided that caveat emptor is not what they want governing their retirement accounts. Having the fiduciary rule in place would surely protect those investors who have yet to figure out who is really working for them. It would be nice to discover that the new administration was more “investor friendly.” But it was not the rules that moved the big firms toward a fee-based business model – market forces did.
Whether the fiduciary rule stays or not, the investing public has figured out what the proper standard should be. Investors are not waiting for the government to make the finance industry put investors’ interests first.
As market forces have revealed, they are insisting on it themselves.
Facebook is getting into a new type of networking.
The social media giant said this week that it is rolling out new features in the US and Canada to let businesses post job openings, and prospective workers find and apply to them through Facebook. “This new experience will help businesses find qualified people where they’re already spending their time—on Facebook and on mobile,” the company said in a blog post.
The system Facebook debuted on Feb. 15 aims to minimize hassle for job-seekers and employers, while also giving both more reasons to use Facebook products. Businesses will be able to post jobs and track applications directly from a company Facebook page, as well as communicate with applicants through Facebook Messenger. They can also pay Facebook to promote their job listings to a wider audience.
Job-seekers will see posts in their news feed and integrated with other posts on business pages. They’ll also be able to check “Jobs on Facebook,” a designated landing page for job listings, pegged to location and sortable by industry (e.g., “real estate,” “restaurant/cafe,” “education”) and job type (e.g., “full-time,” “internship,” “volunteer”).
Already, warning cries are being issued for LinkedIn (which, ironically, just underwent a redesign that makes it look a lot more like Facebook). It’s true that LinkedIn, as the chief player in the online-networking space, could be in for some trouble. But such comparisons also miss a bigger point: Facebook is going after a different and much more significant job market.
Next, there’s the matter of demographics. LinkedIn caters to the mid- to high-skilled job market. Its basic platform is free, but 17% of revenue comes from selling “premium” subscriptions that range from $25 to $100 a month. The site also features “influencers,” who are typically successful businesspeople and entrepreneurs—for example, Richard Branson, Bill Gates, and Arianna Huffington.
Per a November 2016 report from Pew Research Center, 50% of Americans with a college degree or higher used LinkedIn from March to April 2016, compared with just 12% who had a high school degree or less. Forty-five percent of people earning at least $75,000 a year were on the site, versus 21% of those who make less than $30,000.
Facebook, with its mission of “connecting the world,” has appealed to a broader audience. The same Pew report found that Facebook was used by 77% to 82% of Americans of all education levels. Facebook was also used by at least 75% of US adults in every income bracket, and most popular among the lowest earners.
When it comes to matching employers with job seekers, this means Facebook has a much bigger space to play in. Facebook’s users include LinkedIn’s “thought leaders” and white-collar professionals, but they’re also people seeking hourly positions, part-time work, and other opportunities that they’d probably find on sites like Monster, Indeed, or Craigslist long before LinkedIn. Facebook’s job listings for the New York metro area currently include apprentice fitness coach, salon assistant, and professional valet driver.
“We’re taking the work out of hiring by enabling job applications directly on Facebook,” Andrew Bosworth, Facebook’s VP of business and platform, said in a statement. “It’s early days but we’re excited to see how people use this simple tool to get the job they want and for businesses to get the help they need.”
For Facebook, that’s a huge opportunity. For LinkedIn, it was only ever a missed one.
A parade of up-and-coming musicians from Universal Music took the stage at the Ace Hotel in downtown Los Angeles Saturday in a pre-Grammy Awards performance for a room full of the executives who will make or break their careers.
Talent bookers from James Corden’s late-night show, marketing executives from top brands and executives from Spotify Ltd. and YouTube looked on. Sandwiched in between tables for Apple Inc., an imposing player in online music, and Pandora Media Inc., owner of the world’s largest online radio service, sat executives from a new act trying to break onto the scene: Facebook Inc.
The world’s largest social network has redoubled its efforts to reach a broad accord with the industry, according to interviews with negotiators at labels, music publishers and trade associations. A deal would govern user-generated videos that include songs and potentially pave the way for Facebook to obtain more professional videos from the labels themselves.
“We’re hopeful that they are moving towards licensing music for the entire site,” said David Israelite, president of the National Music Publishers Association, an industry trade group.
Facebook’s interest in music rights is inextricably linked to its growing interest in video. Having siphoned ads away from print, online companies have recently targeted TV, which attracts about $70 billion in advertising a year. While Facebook faces competition from Twitter Inc. and Snapchat Inc., its main rival is Google, and music is one of the most popular types of videos on Google’s YouTube service. Facebook declined to make an executive available for an interview.
Licensing music on Facebook would have huge ramifications for the music industry, which is fighting to grab a larger share of the money from online services. With nearly 2 billion users and a growing advertising business, Facebook could provide billions in new sales for the music industry.
Beyond the revenue gains, the music industry could use a deal with Menlo Park, California-based Facebook to exert more pressure on YouTube. Music executives have long assailed what they say is YouTube’s lax approach to copyright enforcement -- even though the video-sharing website is the most popular in the world for music, has catapulted many young artists to stardom and delivered $1 billion in ad revenue to the industry last year.
An agreement with Facebook could also serve as a blueprint for deals with other social-media companies, like Snapchat.
On the other hand, providing Facebook users with another way to get music for free could disrupt the music industry’s recent surge in sales from paid services like Spotify.
The talks with Facebook are complex, involving how to prevent copyright violations in user-generated videos, so a deal could be a couple of months away or more.
Facebook has reassured the music industry that the company will police piracy and share ad sales. Music executives are also encouraged because Facebook in January hired Tamara Hrivnak, a well-liked former record executive who also spent time at YouTube.
Video consumption on Facebook has grown to billions of views over the past couple of years, as TV networks, news organizations and users experiment with the site much like they once did with YouTube. The results have encouraged Facebook to fund original videos, though those plans are still being developed.
For Facebook to obtain professional video -- both music and otherwise -- it may have to alleviate concerns about how clips will be presented. At the moment, most Facebook users see videos in their newsfeed, where a clip from a TV show may be followed by a baby photo and then a friend complaining about romantic frustrations.
Facebook must also finish a system to police copyright-infringing material akin to Content ID, the system used by YouTube. Videos on the site already feature a lot of music for which artists don’t receive royalties -- a major source of tension.
Shares of Facebook fell 0.3 percent to $133.78 at 10:45 a.m. in New York. They closed at a record $134.20 on Feb. 8.
Israelite speaks for many in the music industry when he expresses doubts about the latest online giant to come knocking on the door. Music executives blame large technology companies for using music to sell services and devices without properly compensating artists.
“Facebook is a very valuable company, making a lot of money, and in part because of the music on the site,” Israelite said, adding the social network is protected by the Digital Millennium Copyright Act, the same law that can shield YouTube from responsibility for pirated material. “We are looking forward to being business partners with Facebook. If that doesn’t happen, you’ll see the situation turn very quickly.”
After years of declining sales, the music industry is growing again thanks to the popularity of paid streaming services from Spotify and Apple. Label executives are reluctant to give their music to another free service for fear it could slow that growth.
The music industry has spent the better part of the last year fighting YouTube in the press, and trying to get laws changed so that the video-sharing service bears more responsibility for policing clips that infringe copyrights. The labels took up that fight just as they were negotiating new long-term licensing agreements with YouTube, hoping the pressure would at least result in more favorable deals.
“Facebook definitely has the size and scale, but the tribal nature of music preferences is different than a feed of news stories or cute cat videos,” said Vickie Nauman, a music industry consultant. “To be successful, it will not only need to envision a great music experience but also have to navigate the web of label and publisher rights and relations. No small feat.”
Crude oil has a tendency to bottom in mid-February and then rally through July, with the bulk of the seasonal move ending in late April or early May. It is that mid-February low that can give traders an edge by buying ahead of a seasonally strong period. Going long crude oil’s July contract on or about February 14 and holding for approximately 60 days has been a profitable trade 27 times in 33 years, including the last three years straight, for an 81.8% win ratio with a cumulative profit of $108,660 (based upon trading a single crude oil futures contract, excluding commissions and taxes).
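The win-ratio arithmetic above is easy to verify. A minimal sketch (the 27-of-33 record and $108,660 cumulative profit are from the text; the per-trade average is derived from those figures, not reported):

```python
# Seasonal crude oil trade: long the July contract around Feb. 14, hold ~60 days.
wins, total_trades = 27, 33
cumulative_profit = 108_660  # dollars, single contract, ex-commissions/taxes

win_ratio = wins / total_trades
avg_profit_per_trade = cumulative_profit / total_trades

print(f"Win ratio: {win_ratio:.1%}")              # 81.8%
print(f"Avg profit per trade: ${avg_profit_per_trade:,.2f}")
```

Note that the average conceals wide variation year to year; 2016, per the text, was the second-best result in the trade's history.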
Crude oil’s seasonal tendency to move higher in this time period is partly due to continuing demand for heating oil and diesel fuel in the northern states and partly due to the shutdown of refinery operations in order to switch production facilities from producing heating oil to reformulated unleaded gasoline in anticipation of heavy demand for the upcoming summer driving season. This has refiners buying crude oil in order to ramp up production for gasoline. Last year, crude bottomed in mid-February and that bottom was the end of crude’s multi-year bear market that began in earnest in 2014. The result was the second best performance in this trade’s history going back to 1984. Only 2008 was better.
“The way that we participate in the changes that are going on in the media industry that I fully expect to accelerate from the cable bundle beginning to break down is, one, we started the new Apple TV a year ago, and we’re pleased with how that platform has come along. We have more things planned for it but it’s come a long way in a year, and it gives us a clear platform to build off of,” Apple CEO Tim Cook said.
Apple is on the fourth generation of the Apple TV. It now has an app that makes recommendations across streaming-video services and offers a universal search function; for now, that search only covers a selection of third-party services, but it has the potential to become the online equivalent of a TV Guide for all programming. (The company is also developing a library of original content tied to its Apple Music subscription.)
Media experts have been forecasting the death of the traditional TV bundles for years (BTIG Research media analyst Rich Greenfield tweets with the hashtag “#goodluckbundle”)—and it hasn’t happened yet. But there has definitely been some movement, as Cook pointed out.
So far, bundling hasn’t disappeared in the US. It’s just taken on new forms.
Companies like Verizon have released slimmed-down packages that allow customers to cherry-pick the channels they pay for. Dish Network’s SlingTV, Sony’s PlayStation Vue, and AT&T’s DirecTV Now, among others, offer live and on-demand TV packages through the internet. And Hulu and YouTube are expected to introduce live TV offerings this year.
None of these packages seem to top the user experience that you get with cable yet. But they have their benefits—they’re more affordable, the video quality is improving, and their customer-support teams can’t possibly be as miserable to deal with as those at the cable company.
But with Apple, and now Facebook, expanding into the TV business, a real contender that could take down cable might soon emerge.
In a high-profile attack on growth-killing red tape, President Donald Trump this week ordered that any agency issuing a new rule find two to repeal.
He will likely discover that the only thing harder than getting something done in Washington is getting it undone.
Vast swaths of rules are untouchable because Congress ordered them to be written or the president himself demanded them. Finding rules to repeal is a tedious and time-consuming affair that usually yields tiny savings, mostly in reduced paperwork. Ultimately, rules are passed because they have benefits, from cleaner air to fewer terror attacks, that voters or presidents aren’t willing to forgo.
The first president to tackle the leviathan was Jimmy Carter, who proposed a “regulatory budget” to limit the financial burden of new rules. Every president since has tried the same. George W. Bush invited suggestions from the public on rules to repeal. Barack Obama trumpeted two executive orders requiring federal agencies to “look back” and kill off old rules that no longer justified their cost.
None halted the relentlessly rising burden of regulation.
Perhaps Mr. Trump will be different, but history offers reason for skepticism.
In a sample of 50 of Mr. Obama’s “look-backs,” 76% achieved their savings by reducing administrative costs, such as converting to electronic from paper filing, according to Cary Coglianese, director of the University of Pennsylvania’s Penn Program on Regulation. Such costs are trivial compared with compliance costs, such as installing pollution-abatement equipment or wheelchair ramps, and with lost business opportunities.
Often, the rules that get repealed “nobody cares about anymore: they aren’t imposing any costs at all because things have moved on,” Mr. Coglianese says. In 2004, for example, a rule protecting consumers from deceptive airline ticket sales was repealed. It didn’t matter much because airlines no longer owned the major reservation systems.
Canada and Britain, which have versions of Mr. Trump’s “one in, two out” order, don’t offer much prospect for radical rollback.
“There were very few instances where we repealed regulation outright,” says Jitinder Kohli, who ran Britain’s regulatory overseer from 2005 to 2009 and is now with Deloitte Consulting. “It was very rare that the original intent of the regulation no longer made sense.”
More often, he said, there were more efficient ways to carry out the rule. Britain now mandates £3 of regulatory cost reduction for each pound of cost increase, yet its national auditor has found that costs are so poorly understood, it is unclear what good the government has achieved. For its part, Canada’s regulatory reduction program only covers paperwork costs.
Mr. Trump’s order requires that the costs of any new rule be fully offset by repealed rules in any given year, but it has yet to specify how to measure those costs. Many rules have no cost estimate at all, and the estimates that do exist haven’t been updated since the rules were passed.
Much of what business spends complying with a rule, such as designing more fuel-efficient cars, is “sunk,” and can’t be recovered. “The regulated party may even resist change,” says Susan Dudley, a Bush regulatory official and now director of the George Washington University Regulatory Studies Center. Dropping the requirement for air bags, for instance, won’t make car manufacturers stop installing them.
Mr. Trump’s order could provide a powerful prod to agencies to look for old, costly rules since, if they can’t find any, they may be unable to issue new rules. Even so, laws and courts can still demand that rules be written. Mr. Trump wants to repeal Mr. Obama’s Clean Power Plan. He would still be faced with a 2007 Supreme Court decision that the Environmental Protection Agency regulate carbon-dioxide emissions under the Clean Air Act.
There are more effective albeit less sexy ways to improve regulation: standardize the measure of both costs and benefits, force both Congress and federal agencies to submit laws and major new rules to independent cost-benefit analysis, then mandate a reassessment of the results several years later.
Yet the regulatory burden will likely keep growing so long as the priorities of Congress and the president require it. After the Sept. 11, 2001, terrorist attacks, Mr. Bush created an entire new federal department with thousands of new employees to counter terrorism. Mr. Obama’s health care and financial regulation laws required agencies to issue thousands of rules regardless of their costs and benefits.
Mr. Trump may not be immune. He enacted a temporary ban on visitors from seven mostly Muslim countries over concerns that terrorists might enter the U.S. The president also suspended the U.S. refugee program for four months and reduced the number of refugees the U.S. will accept in fiscal year 2017 to 50,000. How would this perform on a cost-benefit test?
Alex Nowrasteh, an immigration expert at the libertarian Cato Institute, estimates the average American has a one in 3.6 billion chance each year of being killed by a refugee who becomes a terrorist, meaning there is a small benefit from the new rule. At the same time, he puts the cost of Mr. Trump’s order at $350 million per life saved since fewer immigrants make the workforce less efficient. By comparison, federal guidance usually puts the statistical value of a human life at $10 million.
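Mr. Nowrasteh's comparison reduces to back-of-envelope arithmetic. A sketch using only the figures quoted above (the U.S. population number is an added round-figure assumption, not from the text):

```python
# Figures from the passage.
annual_risk = 1 / 3.6e9           # chance per American per year of being killed
                                  # by a refugee who becomes a terrorist
cost_per_life_saved = 350e6       # Nowrasteh's estimate for the order
value_of_statistical_life = 10e6  # typical federal guidance

# The order's cost per life saved is 35x the usual regulatory benchmark.
ratio = cost_per_life_saved / value_of_statistical_life
print(ratio)  # 35.0

# Expected annual deaths implied by that risk, assuming ~325 million Americans
# (population figure is an assumption for illustration).
us_population = 325e6
print(annual_risk * us_population)  # roughly 0.09 expected deaths per year
```

By this benchmark, the order spends far more per statistical life saved than regulations are ordinarily expected to, which is the point the passage draws.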
Which simply proves that when a president’s priorities are at stake, the cost of regulation is seldom an object.
Auto loans have shot past the $1 trillion mark in the United States and now make up a significant component of the overall consumer debt picture.
Subprime auto loans – which are riskier loans made to customers with poor credit – have helped to drive the market since the Great Recession. However, with auto loan delinquencies ticking up in recent months, investors have been searching for answers about the sector.
Are we in for some sort of subprime auto loan crisis, or is there another explanation for what is going on?
Subprime Auto Loans: a Shifting Market
The data and perspective in today’s infographic comes from consumer credit reporting agency Equifax, and it helps to explain what is potentially going on in today’s auto loans market.
Does the recent uptick in auto loan delinquencies represent the unhinging of the market, or is it just standard fare?
Auto Loan Segmentation
The auto loan market is surprisingly diverse, and it is composed of many different types of lenders.
Each lender has a unique set of criteria for their ideal customer. For example, banks want very little risk and typically only lend to customers with prime credit scores (620 or higher). Dealer finance companies, on the other hand, are willing to take on more risk in their portfolios, and usually key in on subprime customers.
In fact, there are several different types of lenders in the auto lending space:
Banks: Depository institutions that lend money to third parties
Captive Auto Finance: Financing arm of an auto brand (i.e. Ford Motor Credit Company, etc.)
Dealer Finance Companies: Associated with dealerships or dealer chains
Monoline Finance Companies: Focus on auto loans through multiple dealers/platforms
Independent Finance Companies: Offer auto loans and other loan types
Because they each approach the market differently, there is strong segmentation in the market. The following chart from Equifax shows a snapshot of loans made in Q1 of 2015 and their cumulative non-performance after 18 months on the books:
However, let’s look at this again by plotting the median credit score for new loans originated in Q1 of 2006, 2009, 2012, and 2015.
After the financial crisis, banks tightened credit standards until performance improved. Monoline and dealer finance companies, on the other hand, continued to lend to high-risk borrowers – and it is these companies that are seeing non-performance rates shifting higher.
In other words, it is the market share and relative performance among lenders that are the change drivers for aggregate loan statistics.