Analogs of the Christian Bible’s Epic of Noah’s Ark

In London, England in 2014, Dr. Irving Finkel, one of the world’s foremost authorities on cuneiform writing, published a book entitled “The Ark Before Noah”, which argues that a roughly 3,700-year-old cuneiform tablet translated by Finkel contains a flood narrative closely paralleling the biblical story of Noah and the flood that drowned the world. This tablet is at the very least 1,000 years older than the biblical epic. In the Christian Bible, Noah is warned of a cataclysmic flood by God. A similar story exists in ancient Indian Vedic texts, in which King Manu was forewarned of an impending great flood by Lord Vishnu in the form of a fish; Manu constructed a large boat and ultimately survived. In the Babylonian poem the Epic of Gilgamesh, the character Utnapishtim (pronounced “ut-nah-pish-tim”) is advised of an impending flood by the god Enki (pronounced “en-kee”). In ancient Aztec culture, a sacred couple, a man and a woman, shelter inside a hollow tree with corn and hold steady as the deluge of a great flood envelops the Earth. Ancient Celtic, Norse, and Chinese mythology also recount similar stories in which a great flood occurs and only some survive. The common denominator among all of these stories is intervention by a force that knew of the impending cataclysm ahead of time.

The Advent of Post-it Notes

Post-it notes were invented by a chemical engineer whose bookmark fell off his hymn book while he was singing in church. The inventor, Arthur Fry, used an adhesive that worked like a basketball’s skin: only a small portion of the adhesive touches the object it’s stuck to, so the note stays sticky no matter how it lands each time.

Modern-Day Slavery Worldwide

It is estimated that 1,200,000 (1.2 million) people in Europe are subjected to slavery, and this figure leaps to a conservative estimate of 40,300,000 (40.3 million) people worldwide as of 2020.

The Etymology of “ChatGPT” and How Artificial Intelligence Models Are Developed

OpenAI’s artificial intelligence “ChatGPT” is named as such because of the acronym “GPT”, which stands for “generative pre-trained transformer”. The generative pre-trained transformer provides the defining characteristics of a large language model like ChatGPT. The term “generative” refers to the artificial intelligence’s ability to generate new content similar to the data sets it was trained upon. The term “pre-trained” refers to the initial training phase in which the large language model learns from a vast amount of text data before being fine-tuned for specific tasks; this pre-training helps the model understand language patterns and context. The term “transformer” refers to the transformer architecture, a neural network design that relies upon a mechanism called “attention” to weigh the influence of different parts of the input data. This transformer architecture is particularly effective for tasks that involve understanding the context of language (e.g. language translation or answering questions), which is why such artificial intelligence models are able to understand human language in a deep and complex way.
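
To make the “attention” idea above a little more concrete, here is a minimal sketch of scaled dot-product attention, the core operation inside a transformer layer. It is written in plain NumPy purely for illustration; the function name, matrix sizes, and toy input are assumptions for this example, not OpenAI’s actual implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal sketch of the attention mechanism used in transformers.

    Q, K, V are (sequence_length, d_k) matrices of queries, keys, and values.
    Each output row is a weighted mix of the value rows, where the weights
    reflect how strongly each position "attends" to every other position.
    """
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled to keep values stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns each query's scores into weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Weighted sum of the values: positions with higher weights influence the
    # output more, i.e. the model weighs the influence of different parts of
    # the input data.
    return weights @ V

# Toy example: a 4-token sequence with 8-dimensional embeddings (self-attention).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
output = scaled_dot_product_attention(x, x, x)
print(output.shape)  # (4, 8)
```

Stacking many such attention layers, interleaved with ordinary feed-forward layers, is what allows a transformer to weigh distant words against nearby ones when working out the context of a sentence.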