The book From Computing to Computational Thinking can be ordered at: computize.org
Paul S. Wang, Ph.D.
The dictionary defines the word cache as “a store or collection of items kept in a safe place”. However, in computing, cache is a technical term referring to a memory used for fast storage and retrieval of data. In this blog post we are going to talk about the latter. An understanding of the cache concept can focus our attention on ready access and efficiency. Knowing how cache memory works inspires us to employ the same ideas to better manage businesses, improve our daily living, and even save lives.
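To make the idea concrete, here is a minimal sketch in Python (an illustration, not from the post itself) of a least-recently-used (LRU) cache, one common policy for deciding which items stay in the small, fast store; the web-page contents used as data are hypothetical.

```python
from collections import OrderedDict

class LRUCache:
    """A tiny least-recently-used (LRU) cache: keeps recently
    accessed items ready for fast retrieval and evicts the least
    recently used item when the cache is full."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                      # cache miss
        self.items.move_to_end(key)          # mark as recently used
        return self.items[key]               # cache hit

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict least recently used

# Frequently requested items stay ready at hand.
cache = LRUCache(capacity=2)
cache.put("home", "<html>home page</html>")
cache.put("about", "<html>about page</html>")
print(cache.get("home"))                     # hit
cache.put("blog", "<html>blog page</html>")  # evicts "about"
print(cache.get("about"))                    # miss -> None
```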
Paul S. Wang, Ph.D.
The Internet and the Web have made the entire world a global village. Digital information travels at light speed, making instant communication and interaction among people, near and far, a reality. Yet, because of their open nature, data traveling on the Internet and the Web are subject to public view, making information security and privacy a real concern for almost everyone.
Paul S. Wang, Ph.D.
We communicate with patterns. Reading is to remember spelling or stroke patterns in a language. Speaking is to pronounce words in prescribed sound patterns. To understand speech, we must first hear and detect the sound patterns, connect them to words, filter the words through grammar patterns, and then interpret their meaning in the right context. We can even think of the context as the overarching pattern containing the words.
Paul S. Wang, Ph.D.
Digital computers are logic machines. They use bits to store information. A bit is nothing but a switch with two states, on and off. In computer speak, we use 1 to represent the on state, 0 the off state. In logic speak, 1 is called true, and 0 false. Combining the two, a bit can either be in the state on/1/true or in the state off/0/false. In other words, inside the computer 1 means true and 0 means false.
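As a small illustration (a Python sketch, not from the post itself), the on/1/true and off/0/false correspondence shows up directly in code:

```python
# In Python, the Boolean values True and False are in fact the
# integers 1 and 0, mirroring the two states of a single bit.
print(int(True), int(False))   # 1 0
print(True == 1, False == 0)   # True True

# Logic on bits: AND, OR, NOT expressed entirely with 1s and 0s.
a, b = 1, 0
print(a & b)   # AND -> 0 (false)
print(a | b)   # OR  -> 1 (true)
print(1 - a)   # NOT -> 0 (flips the bit)
```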
Paul S. Wang, Ph.D.
The term Artificial Intelligence, or AI, has seen wider and wider use, certainly in the past five years or so. Proponents say AI is the future of information technology (IT) and that companies should get on the AI bandwagon. Tractica, a market research firm, forecast huge global market growth for AI software, expecting revenue to increase from about 10 billion U.S. dollars in 2018 to 126 billion by 2025.
Paul S. Wang, Ph.D.
We live in a world dominated by information technology (IT). IT is everywhere and enriches our lives in countless ways. At the very heart of IT is the digital computer. Digital computers use 1’s and 0’s, nothing else, to represent and store information. There are no exceptions: all data and all programming are coded in 1’s and 0’s.
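For instance, this short Python sketch (an illustration, not from the post) exposes the 1’s and 0’s behind a number and a piece of text:

```python
# Every kind of data reduces to 1s and 0s.
n = 202
print(format(n, "08b"))            # the integer 202 as 8 bits: 11001010

text = "IT"
bits = " ".join(format(b, "08b") for b in text.encode("utf-8"))
print(bits)                        # 'I' and 'T' as bytes: 01001001 01010100
```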