Claude Shannon — considered the father of information theory — was one of the greatest engineers of the 20th century, says Rodney Brooks. In 1948, “he came up with a way of measuring the information content of a signal and calculating the maximum rate at which information could be reliably transmitted over any sort of communication channel.”
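For readers who want the gist of those two ideas, the quantities Shannon defined in his 1948 paper are now usually written (in modern notation, not quoted from the article) as the entropy of a source and the capacity of a band-limited noisy channel:

\[
H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad
C = B \log_2\!\left(1 + \frac{S}{N}\right)
\]

Here \(H(X)\) measures the average information content of a source in bits per symbol, and \(C\) is the maximum rate (bits per second) at which data can be sent reliably over a channel of bandwidth \(B\) with signal-to-noise ratio \(S/N\).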
At a time when there were fewer than 10 computers in the world, Shannon speculated on their use beyond numerical calculation, including language translation and logical deduction, ideas that arguably paved the way for machine learning.
Read more at IEEE Spectrum.