Is the human species capable of celebrating intelligence as a deeper (and more open) data flow than politicians printing paper money?
Economistwater.com: Do you know that even the world's biggest nations will fail in the 2020s unless their peoples celebrate copiloting water and energy transmission (CLICK TO PUZZLES of 25% more in 2020s) maps intelligently?
MOTHER EARTH'S CODES: ELECTRIFICATION POWERS THINGS WITH ELECTRICITY; INTELLIGENCE EMPOWERS PEOPLES; FRESH WATER CONNECTS OUR HEALTH & EMOTIONAL COOL. Please LinkedIn with me chris.macrae@yahoo.co.uk (Wash DC) to add where we the peoples can contribute to these 4 spaces for unearthing humanity's intelligence boosters:
  • Paris Intelligence Action Summit, February
  • Santa Clara future of accelerated computing partners - Nvidia, Santa Clara; Japan's Osaka Expo - 6 months in which any nation's pavilion can virally survey the intelligence of any other pavilion
  • Canada's G7 - will all 7 nations' leaders sink or swim together? Of course, if we the peoples can decide what intelligence's top 20 spaces need to be, we have a chance to change every education moment of every age of person at every community around the world, in line with the future of capitalism that The Economist premiered in 1976. Japan and Silicon Valley had played with the first decade of Moore's Law - would other places be free to entrepreneurially join in the million times more compute in time?

    Saturday, April 13, 2024

    Nvidia & the World from 1993 to 2025

    Some dates approximate - we welcome revisions, & if used we are happy to link to your profile somewhere
    We began reporting brainworkers' words after briefings from Von Neumann & his peer NET (Neumann-Einstein-Turing), applied over 40 years by dad Norman Macrae at The Economist. I was born in 1951, so growing up with computers was quite curious - in London they weren't seen in high schools (the slide rule was the best tool for maths people) before I started my gap year in 1968. I worked at the Ministry of Transport; there a director was getting cards punched in binary - there was a shared mainframe at the Central Electricity Generating Board. Sometimes I was ordered to carry the binary cards to and fro; apparently if one card got out of order the whole overnight program broke down.

    Anyhow, it is sort of amazing that even when personal computers arrived - 1984 being the famous year of the Apple Mac and the rival IBM PC getting Microsoft software - the central processing unit did sequential computing.

    Enter 1993: Nvidia and Jensen Huang - building accelerated computing for what computers' sequential, one-dimensional computing can't do.
    Two years later, in 1995, Microsoft Windows arrived and a second Taiwanese-American started up - Jerry Yang's Yahoo.
    You can see Nvidia's story at
    the blog - featured Jensen stories include his role as chief executive officer and a member of the board of directors.


    An article from 2015/16 explains world summits piloted from 2009; it includes:

    GPU Deep Learning “Big Bang”

     As I wrote in an earlier post (“Accelerating AI with GPUs: A New Computing Model”), 2012 was a landmark year for AI. Alex Krizhevsky of the University of Toronto created a deep neural network that automatically learned to recognize images from 1 million examples. With just several days of training on two NVIDIA GTX 580 GPUs, “AlexNet” won that year’s ImageNet competition (the annual world championship hosted by Stanford's Fei-Fei Li), beating all the human expert algorithms that had been honed for decades. That same year, recognizing that the larger the network, or the bigger the brain, the more it can learn, Stanford’s Andrew Ng and NVIDIA Research teamed up to develop a method for training networks using large-scale GPU-computing systems.


    The world took notice. AI researchers everywhere turned to GPU deep learning. Baidu, Google, Facebook and Microsoft were the first companies to adopt it for pattern recognition. By 2015, they started to achieve “superhuman” results — a computer can now recognize images better than we can. In the area of speech recognition, Microsoft Research used GPU deep learning to achieve a historic milestone by reaching “human parity” in conversational speech.

    Image recognition and speech recognition — GPU deep learning has provided the foundation for machines to learn, perceive, reason and solve problems. The GPU started out as the engine for simulating human imagination, conjuring up the amazing virtual worlds of video games and Hollywood films. Now, NVIDIA’s GPU runs deep learning algorithms, simulating human intelligence, and acts as the brain of computers, robots and self-driving cars that can perceive and understand the world. Just as human imagination and intelligence are linked, computer graphics and artificial intelligence come together in our architecture. Two modes of the human brain, two modes of the GPU. This may explain why NVIDIA GPUs are used broadly for deep learning, and NVIDIA is increasingly known as “the AI computing company.”

    podcast
    on-demand partner videos using new accelerated computing
    or this in the track of major Nvidia summits since 2015:
    2015 Washington DC - opening headline Deep Learning
