No Minister


Big Metal


One of the most basic things you learn in economics is the classic supply-and-demand graph, showing how the two things interact to produce the delicate balance that establishes the price of something, be it strawberries or cell phones.
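That textbook picture can be sketched in a few lines. This is a toy model with hypothetical linear curves – every number in it is made up purely for illustration:

```python
# Toy supply-and-demand model (all numbers are invented for illustration).
# Demand: quantity wanted falls as price rises; supply: quantity offered rises.
def demand(price):
    return 100 - 2.0 * price   # hypothetical demand curve

def supply(price):
    return 10 + 1.0 * price    # hypothetical supply curve

# Equilibrium is the price where the two curves cross:
# 100 - 2p = 10 + p  ->  3p = 90  ->  p = 30
equilibrium_price = 90 / 3.0
print(equilibrium_price)           # 30.0
print(demand(equilibrium_price))   # 40.0 units demanded...
print(supply(equilibrium_price))   # ...and 40.0 units supplied
```

The crossing point is the "delicate balance" – and the catch, as below, is that nobody actually knows the real-world shape of those two curves.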

What they don’t tell you is how useless that graph is in practice for predicting what the price will be when demand or supply (or both) change. Behind that calculation sit supercomputers and richly detailed software models, but the best they produce is a range, sometimes so broad as to be practically useless.

All one can say for certain is that if supply exceeds demand the price will drop, and the more supply exceeds demand, the greater the drop will be. The supply does not even have to be in the marketplace to have an impact. For example, below is a graph of US prices for LNG (Liquefied Natural Gas) over the last twenty years.

There are price spikes all the time, driven by all sorts of factors. But you’ll notice that there was a steady rise up to 2005, followed by a steady drop to the present day. That’s the impact of fracking and horizontal drilling in gas fields, a technology that had been developing for decades but which only took off in the mid-2000s. Fracking showed that the USA had vastly greater reserves of gas than had been predicted only a few years earlier. In fact, previous forecasts were that the nation would have to start importing LNG, which resulted in several giant terminals being built around the coast to unload the stuff from ships. Within a few short years that had been turned around, literally. The reserves “discovered” were so vast they amounted to hundreds of years of use, and the terminals were re-configured to export the stuff.

Meanwhile the impact on the price was not just to greatly reduce it, but to do so as far out into the future as could be seen. Another impact was that electrical generation companies switched large numbers of coal-fired power plants to being gas-fired. That, in turn, resulted in the USA reducing its CO2 emissions on such a scale that by 2020 it was beating the reduction targets of the 1997 Kyoto Protocol (which its Senate had rejected) and the 2009 Waxman-Markey bill (which never passed).

So if you like reducing GHG’s (Greenhouse Gases), hug a fracker and thank them.

It’s not so easy to pull the same stunt with other commodities, metals in particular, since it’s hard to see any comparable revolution in mining them – although there will be steady, incremental improvements – and the planet has been well searched over the centuries for sources.

But there is one potential source that could change things in the 21st century:

Astronomers have now identified two metal-rich asteroids in orbit near the Earth, with one having a precious metal content that likely exceeds the Earth’s entire reserves.

Asteroid 1986 DA is estimated to be about 1.7 miles across, based on radar data obtained during a close Earth fly-by in 2019. The second asteroid, 2016 ED85, appears to have a similar composition from spectroscopy, but no radar data has yet been obtained for it, so much less is known.

Figure 13 from the paper illustrates the amount of precious metals available in asteroid 1986 DA, compared with the world’s entire reserves (Fe = iron, Ni = nickel, Co = cobalt, Cu = copper, PGM = platinum group metals, Au = gold). Mining this single metal asteroid could literally double the metal that has previously been mined on Earth.

Sure, but the technology to mine those metals and transport them to Earth does not yet exist, although there have been plenty of ideas over the decades, and the basics are understood.

But even when it’s developed there’s going to be a question of cost versus revenue, which brings us right back to that supply-and-demand graph. What would happen to the price of all these metals if such a source could be mined and added to the world’s reserves? The paradox is that the price might fall so low as to make the whole effort uneconomic.

The authors of that paper do actually try to account for this price drop, but the simple fact is that it’s as much of a guess as predicting the price of strawberries when that market is flooded. You know it will go down, but to what value, precisely?
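You can see why the guess is so wide with a hedged sketch. Assume a constant-elasticity demand curve (Q = A·P^(−ε)) and suppose the supply of some metal doubles; the market-clearing price then becomes P·2^(−1/ε). The elasticity values below are assumptions, not measurements – which is exactly the problem:

```python
# Hedged sketch: an uncertain demand elasticity turns a *known* supply shock
# (quantity doubles) into a very wide price range.
# Constant-elasticity demand: Q = A * P**(-eps), so doubling Q implies
# P_new = P_old * 2**(-1/eps). The elasticities here are guesses.
old_price = 100.0
for eps in (0.5, 1.0, 1.5):   # plausible-but-assumed elasticities
    new_price = old_price * 2 ** (-1 / eps)
    print(f"elasticity {eps}: price falls from 100 to {new_price:.0f}")
# eps=0.5 -> 25, eps=1.0 -> 50, eps=1.5 -> ~63
```

Depending on a parameter nobody knows precisely, the "answer" spans a factor of two and a half – the range-so-broad-as-to-be-useless problem again.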

We estimated that the amounts of Fe, Ni, Co, and the PGM present in 1986 DA could exceed the reserves worldwide. Moreover, if 1986 DA is mined and the metals marketed over 50 yr, the annual value of precious metals for this object would be ∼$233 billion.

In any case, it may well be that the metals never get to Earth because heavy industry slowly moves off the planet and there will be human colonies established in space that will need the metals right there. Getting them from asteroids certainly makes more economic sense than digging them out of the Moon or another planet. That seems to be what Jeff Bezos is thinking as he pushes forward with his Blue Origin rocket company (To rouse the spirit of the Earth, and move the rolling stars):

In Bezos’ view, dramatically reducing the cost of access to space is a key step toward those goals.

“Then we get to see Gerard O’Neill’s ideas start to come to life…

“I predict that in the next few hundred years, all heavy industry will move off planet. It will be just way more convenient to do it in space, where you have better access to resources, better access to 24/7 solar power,” 


Written by Tom Hunter

November 1, 2021 at 6:53 am

Big Brains.


The photo shown here is a typical CPU (Central Processing Unit) silicon chip used in desktop and laptop computers.

Specifically this is an AMD Ryzen 3 2200, a so-called “entry-level” chip for people building their own desktop gaming computers. Just a few years ago the power of such chips was still talked of in terms of the number of transistors they held, with one of the classics, the Intel 8086 of the 1980s, having 29,000.

That AMD chip has 5 billion transistors.

As such, people nowadays usually talk about other measures of power, such as clock speed and “cores”. What’s a core? It’s a processing unit, a computer in itself. If the original silicon chips were said to be “a computer on a chip”, then a 4-core chip like the Ryzen 3 2200 has four computers on a chip – and that’s pretty ordinary now. There are retail chips with 64 cores.

Why do this? Why not just keep making a single core ever larger? There are scaling problems, not just in the hardware but in using a single processor to do a job. Instead, use is made of something called parallel computing, where one job is split into many smaller ones, all run at the same time. It started off as something used only in the supercomputers simulating things like nuclear explosions. Parallel processing makes it possible to perform ever larger data-processing jobs in human time.
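The split-one-job-into-many idea can be shown in a few lines. This is a minimal sketch, not a real workload – it sums a million numbers by cutting the range into four chunks and handing each to a worker, as if to four cores:

```python
from concurrent.futures import ThreadPoolExecutor

# Minimal sketch of parallel computing's split-and-combine idea:
# one big job (summing a range of numbers) is cut into chunks that
# workers process concurrently, then the partial results are merged.
def partial_sum(chunk):
    return sum(chunk)

numbers = range(1_000_000)
chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]

with ThreadPoolExecutor(max_workers=4) as pool:   # four "cores"
    total = sum(pool.map(partial_sum, chunks))    # merge partial sums

print(total == sum(numbers))   # True: same answer as doing it serially
```

(For truly CPU-bound work Python would use a process pool rather than threads, but the split-and-merge shape is the same.)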

The thing is that the human brain is also, basically, a parallel processor (Minsky), and a pretty massive one at that, with some 100 trillion synapses – the connections between its roughly 86 billion neurons. Each neuron is like a transistor, but with thousands of connections to other neurons, which multiplies that raw count in terms of data storage and delivers processing power beyond what should be possible given how much slower nerve impulses are compared to electronics.

The SF author Arthur C. Clarke, in the book version of 2001: A Space Odyssey, actually refers to Minsky’s research on neural networks in explaining how the infamous computer HAL 9000 was developed, which shows you the sort of background study Clarke did for his stories. Minsky was an advisor on the movie.

For decades, most of these neural networks amounted to artificial “neurons” created in software – basically a clunking simulation, ultimately limited by the hardware it ran on. You could do interesting things, just slowly.
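A single one of those software “neurons” is almost trivially simple – weighted inputs, a bias, and a squashing function. The weights below are arbitrary examples, chosen only to show the mechanism:

```python
import math

# One software "neuron": weighted inputs, a bias, and a squashing function.
# Networks of thousands of these, simulated in ordinary code, are the
# "clunking simulation" described above. The weights are arbitrary examples.
def neuron(inputs, weights, bias):
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-activation))   # sigmoid squashing to (0, 1)

out = neuron(inputs=[0.5, 0.9], weights=[0.8, -0.4], bias=0.1)
print(round(out, 3))   # ≈ 0.535
```

Run that for every neuron, layer by layer, on a conventional CPU, and you see why it was slow – which is the bottleneck the hardware below attacks directly.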

Which brings me to this news story, World’s Largest Chip Unlocks Brain-Sized AI Models With Cerebras CS-2.

Cores? It has 850,000 of them!

Cerebras Systems today announced that it has created what it bills as the first brain-scale AI solution – a single system that can support 120-trillion parameter AI models, beating out the 100 trillion synapses present in the human brain. In contrast, clusters of GPUs, the most commonly-used device for AI workloads, typically top out at 1 trillion parameters. Cerebras can accomplish this industry-first with a single 850,000-core system, but it can also spread workloads over up to 192 CS-2 systems with 162 million AI-optimized cores to unlock even more performance. 

Are you scared?

I am, and I’ve been in this world for most of my life.

Of course it’s going to take some time to develop the software to truly use this baby, but if we get to the point where the software-hardware configuration can start truly learning and growing by itself, well…

Written by Tom Hunter

October 7, 2021 at 9:07 am