Moore’s Law in Software Abstraction

The future of computing performance lies in software and hardware parallelism. Simply put, application programs must be expressed by splitting work into numerous computations that execute on separate processors which communicate only from time to time, or better yet never communicate at all.
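As a minimal illustration (not from the article, and with illustrative function and workload names), work split into fully independent tasks that never communicate can be expressed with Python's multiprocessing module:

```python
from multiprocessing import Pool

def simulate(seed):
    # Each task depends only on its own input: no shared state,
    # no messages between workers while they run.
    total = 0
    for i in range(10_000):
        total = (total * 31 + seed + i) % 1_000_003
    return total

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # The only communication is distributing inputs at the start
        # and collecting results at the end.
        results = pool.map(simulate, range(8))
    print(results)
```

Because the tasks share nothing, adding processors scales the work almost linearly; it is the occasional-communication and shared-data cases that make parallel programming hard.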

For much of the last 30 years, programmers have not needed to rewrite their software in order for it to get faster. It was expected that software would run faster on the next generation of hardware as a result of the performance gained by shrinking transistors and squeezing more of them onto a piece of silicon. Hence, programmers focused their attention on designing and building new applications that executed correctly on existing hardware but — anticipating the next generation of faster hardware — were often too compute-intensive to be effective. The demand for next-generation hardware has been significantly driven by software pressure.

The sequential-programming model evolved in that environment as well. To create more capable software, developers relied heavily on high-level sequential programming languages and software abstraction — reusing component software and libraries for common tasks.

Moore’s law helped propel the advancement of sequential-language abstractions, since increasing processor speed covered their costs. For example, early sequential computers were programmed in assembly language statements, which have a 1:1 mapping to the computer-executed instructions. In 1957, Backus and his colleagues recognized that assembly-language programming was difficult, so they launched the first implementation of Fortran for the IBM 704 computer. The IBM programmers wrote in Fortran, and then a compiler translated Fortran into the computer’s assembly language. The team made these claims:

  • Programs will contain fewer errors.
  • It will take less time to write a correct program in Fortran.
  • The performance of the program will be comparable to that of assembly language.

These benefits of high-level languages are now generally accepted. As computers increased in speed, modern-day programming languages added an increasing number of abstractions. Modern languages such as Java, F#, PHP, C#, Ruby, Python, and JavaScript provide features such as automatic memory management, object orientation, dynamic typing, and static typing, among others — all of which reduce the programming burden. They often do so at a performance cost, but companies chose these languages to enhance the correctness and functionality of their software, which they valued more than performance. While the initial transition from hand-coded assembly language came with performance gains, these higher levels of abstraction often come with a performance drawback.

Understanding the “memory wall”

With the programming changes that are required to move from single-core to multi-core processors, software developers are finding it ever more challenging to deal with the growing gap between processor and memory-system performance, often referred to as the “memory wall.”

The memory wall reflects a constant shift in the balance between the costs of computational and memory operations and adds to the complexity of achieving high performance. Effective parallel computation requires coordinating computations and data — the system must collocate computations with the data and the memory in which it is stored. And while chip processors have achieved great performance gains from technical advancements, main-memory bandwidth, energy efficiency, and latency have improved at a lower rate for years.

The result is a performance gap between processor and memory that has become more and more significant. While on-chip cache memory partially bridges the gap, even a cache with a state-of-the-art algorithm to predict the next operands needed by the processor cannot close that gap effectively. The aggregate rate of computation on a single chip will continue to outpace improvements in main-memory capacity and performance. Thus, the advent of chip multiprocessors means that the bandwidth gap will almost certainly continue to widen.

To keep the memory system from strictly limiting system power and performance, applications must have locality, and the amount of that locality must increase. To boost locality, it is imperative to design software in a manner that will reduce (1) the communication between processors, and (2) data transfer between the memory and processors. In other words, make data travel less and minimize the distance that data needs to travel.
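To make the locality point concrete, here is a small, hypothetical Python sketch (the names are illustrative, and the effect is far stronger in compiled languages, where memory layout is exposed directly): both functions compute the same sum, but one walks the data in the order it is stored while the other jumps to a different row on every access.

```python
# Hypothetical sketch of access-pattern locality; names are illustrative.
N = 256
matrix = [[(i * N + j) % 7 for j in range(N)] for i in range(N)]

def sum_row_major(m):
    # Visits elements in the order rows are stored: good locality.
    return sum(x for row in m for x in row)

def sum_col_major(m):
    # Jumps across rows on every access: poor locality.
    n = len(m)
    return sum(m[i][j] for j in range(n) for i in range(n))

# Both traversals produce the same result; only the data movement differs.
assert sum_row_major(matrix) == sum_col_major(matrix)
```

In C or C++ the column-major version can run several times slower on large arrays because each access misses the cache; the remedy the article describes — increasing locality — amounts to choosing the storage-order traversal, or blocking (tiling) the loop so each chunk of data is reused while it is still in cache.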

Mind the Gap

With the benefits of Moore’s Law winding down, the burden of increasing application performance should, ideally, be shared by software developers and hardware designers. But emerging new technologies may obfuscate those responsibilities with an interesting hybrid solution. Recent achievements by companies like SimpleMachines have given birth to new software and new hardware that synchronizes to eliminate this gap – all without the original software developer needing to be aware of the underlying transformation that takes place and without the hardware designer ever needing to know what application will be running.

The keys to these new technologies are two breakthroughs: 1) a new compiler design that focuses on understanding application intent rather than individual instructions, and then converts that algorithm into a series of processes that are each optimized to reduce data transfer; and 2) a chip architecture that dynamically changes the computation flow to address the locality issue based on algorithm needs. These breakthroughs offer a radically new way to efficiently solve a problem that would otherwise be extremely time- and capital-intensive to address, and do so in a way that is future-proofed against enhancements that would otherwise only continue to widen that gap.


About the Author

Karu Sankaralingam, PhD, is founder/CEO/CTO of Madison, WI-based SimpleMachines, Inc. (SMI). Dr. Sankaralingam started as a professor of Computer Science at UW-Madison in 2007. He has 17 patents and has published 91 papers. Founded in 2017, SMI is an AI-focused semiconductor company. For more information, please visit https://www.simplemachines.ai/.

What are the best cryptocurrency exchanges in the world?

With the growing popularity of the topic of cryptocurrencies, more exchanges offering transactions using virtual coins appear on the market. Under the pressure of competition, exchanges around the world are constantly developing, improving their platforms to be faster and more user-friendly. What are the best cryptocurrency exchanges in the world? We present three of the most popular platforms.

Binance – many years of experience

The exchange itself was established in 2017, but its founder, Changpeng Zhao, had previously worked in teams related to finance and cryptocurrencies. In 2005, he founded the company Fusion Systems, which created, among other things, high-frequency trading systems for brokers. In 2013, Zhao joined the Blockchain.info team as a member of a group working on a cryptocurrency wallet. Binance was originally based in China, but after the Chinese government banned cryptocurrency trading, the company moved to Japan. Subsequently, offices in Taiwan and Malta were opened. As early as 2018, Binance was the world’s largest cryptocurrency exchange, with a market capitalization of $1.3 billion.

Binance is not only an exchange, but an entire “ecosystem”. Among other things, it offers users Binance Academy, a place where they can find free, quality materials explaining the principles of blockchain and other cryptocurrency concepts. Binance Labs coordinates and finances projects based on blockchain technology, and Binance Research provides high-quality analysis. The company also has its own decentralized wallet and even official Telegram channels that keep the entire community in touch. However, perhaps its most important distinguishing feature is its own cryptocurrency, Binance Coin, which enables cheaper trading with other cryptocurrencies through reduced commissions. The exchange can be accessed via a web browser, a desktop program available for Windows, macOS, and Linux, and mobile applications for Android and iOS. Such an extensive structure makes Binance the best cryptocurrency exchange in the world.

More information about the Binance exchange can be found in this article: bitcoin-exchange.uk/binance

Coinbase – safety and practicality

Coinbase was born in 2012, and its creators are Brian Armstrong and Fred Ehrsam. Initially, the team also included Ben Reeves, the co-founder of Blockchain.info, but after a few months he left the newly created project. By 2014, the platform was used by a million users. Currently, the number of registered and verified users is over 43 million.

As the authors noted, when creating Coinbase they wanted everyone, regardless of location, to have easy access to cryptocurrencies. The aim was to create an open financial system that would not be under the control of any state, while allowing for quick payments and universal access to financial services. All of this is intended to help level the playing field and lift millions of people out of poverty. The strengths of Coinbase include paying special attention to safety, as well as creating an easy-to-use platform on which both beginners and professionals can operate. It is currently the largest cryptocurrency exchange in the United States by trading volume.

BitFlyer – the whole world at your fingertips

BitFlyer was founded in Japan in 2014 by trader Yuzo Kano. Around the same time, another Japanese cryptocurrency exchange, Mt. Gox, which had handled over 70% of all bitcoin transactions in the world, collapsed.

“BitFlyer is the easiest and safest way to buy and sell Bitcoin, Ethereum and more,” the authors say about the exchange. The company is licensed to operate throughout Europe, the United States, and Japan. You can start using the platform with a small amount of just 1 euro. You can top up your account conveniently by bank transfer or via PayPal, and you can also purchase with a credit or debit card. There are seven digital currencies available on the exchange, including Bitcoin, Ethereum, and Litecoin.

Summary

In this article, we have presented only three of the most popular cryptocurrency exchanges. Of course, there are many more platforms for trading and exchanging cryptocurrencies. You can read more about the best cryptocurrency exchanges in this article.

How Technology Has Advanced The Cosmetic Industry

As we head further into the digital era, the use of technology has continued to revolutionise our lives in so many ways. But how has it changed the medical sector? In this article, we will be providing you with insight into how technology has advanced the cosmetic and medical industry in the last 10 years.

AI And Computer Technology Has Improved Customisation

When looking at the cosmetic industry, one common problem that many people have faced is finding the perfect shade of foundation that matches their skin. The use of AI and computer technology allows the skin to be scanned to determine the base colour as well as the undertones, finding the perfect match.

Video Calls Have Improved The Consultation Process

In addition to the use of computers in-store to find the perfect shade, the world of technology has also allowed for computers to be used in the consultation process. This has been used by several medical practices in the UK, as well as those providing a hair transplant procedure in Turkey, to speed up the consultation process and limit the amount of travel needed to get to and from appointments. This way of communicating with medical staff is only set to grow as we continue to adapt our way of life to accommodate Coronavirus and the social distancing restrictions that have been put in place.

Increased Performance For Cosmetic Procedures

Technology has also allowed several cosmetic procedures to be completed in half the time they would have taken 10 years ago. With state-of-the-art technology and electronic reporting, time spent in surgery has significantly decreased, as has recovery time. For example, the use of keyhole surgery as a replacement for several other surgical procedures has helped to limit the risks associated with some of these surgeries, as well as improve the overall healing process for patients. As a result, this has also sped up patients’ return to their everyday lives following a surgical procedure.

Virtual Try-On Will Replace The Traditional Samples

The final way that technology has advanced the cosmetics industry is through virtual try-on. With the pandemic leading to the removal of makeup testers in stores, many brands have turned to computer technology to provide a virtual try-on to those shopping online. This has benefitted several smaller brands as well as larger companies such as Charlotte Tilbury, as it captures not only the digital market but also provides a new experience for customers to try when they return to physical stores — an experience these companies have never been able to offer before.

With so much set to change in the next five years, there are several ways this trend will continue in 2021. How do you think it will continue to change in the not-too-distant future?

Are CEOs Really Necessary Anymore?

It seems like a ridiculous question to ask, somewhat like wondering whether cars really need drivers. Just imagine all the things a driver does every second in order to reach a specific destination: taking in vast amounts of inputs about current conditions of the vehicle’s motion, receiving thousands of changing data points from all the visual clues about lanes, traffic, signs, pedestrians and all the other moving vehicles in the vicinity, then comparing all this information to a previously set route, and making all the complex choices necessary to arrive safely.

You could almost think about that driver as being on the receiving end of a firehose of data, sorting out the most important patterns, and then turning all of that into a best course of action — the very definition of Intelligence. And that’s why we’ve come so close to going from data that one human can process to Big Data, which requires dozens of sensors to process.

With increasingly vast bodies of knowledge about experiences, one can see how business Intelligence, with enough computing power, became Artificial Intelligence. And so, before too long, the taxi you’re about to hail in Phoenix shows up. Poof! No driver necessary.

Which brings us back to those folks in the corporate driver’s seat — the CEO. Doesn’t much of a CEO’s job consist of being on the receiving end of ever-increasing floods of data that can now be gleaned in real time from inputs around the globe? The tick of every sale quickly contributes to a pattern revealing how the marketplace is receiving our products at every given moment. Supply chains are linked to these inputs, as is every other variable the CEO needs to be concerned about, from available corporate resources to stock price.

And as AI begins to make choices based on mining Big Data, the role of the CEO as patch cord between data input and decision output seems destined to become smaller and smaller until, at some point, an organization is going to run autonomously. As futurist Ray Kurzweil observed in 2005, in the near future machine intelligence is going to exceed human intelligence. He named that moment the Singularity. Will there be a moment when the Singularity arrives in the C-suite? It seems inevitable.

AI or Human Agency?

Or maybe not. Maybe great organizations are not really machines, like some automobiles or even spacecraft, that can complete their journeys without human intervention. To find out, it may be worthwhile to make some sharp distinctions between what Big Data driving AI can do, and what it cannot. BDAI (for short) is excellent at making sense out of the current state. It’s also pretty good at making predictions about trajectories, given no black swans or other unforeseen circumstances. So BDAI is pretty useful for management to be able to see where we are and where we might be headed.

But what about agency, or intentionality, or what today we generally call strategy? If we have enough past information about competitive successes and failures, BDAI is capable of helping leaders develop options. In some instances, in a large consumer products organization, for example, it is not difficult to imagine letting BDAI decide the optimal number of versions of a toothpaste brand that will maximize performance in the marketplace, and even continue to optimize those decisions over time.

Yet, what happens when there is a genuine disruption in a marketplace, when new inventions shuffle the whole deck? If BDAI had been in place at Olympus Camera on the day that Steve Jobs introduced the iPhone, would the company’s management information system have warned leadership that the pocket camera industry, at that moment, was entering an irreversible swoon?

The CEO’s Role – Wisdom and Innovation

Finally, we come to the two basic responsibilities that a CEO can perform that, as yet, BD and AI together cannot. The first is to make wise decisions over time that express a coherent vision. The second is to lead innovation. Famously, Steve Jobs had no interest in market research when imagining where Apple needed to go next. He thought in broad terms about what human beings might do with powerful new tools, and went about creating them. Sometimes, it took a while for people to get what Jobs was giving them, but eventually, he re-ordered the world.

Same for Elon Musk. Musk’s long arc in guiding Tesla — from a largely ignored sports car, which financed the luxury Model S, which in turn made the Model 3 possible — is now upending an entire global industry. And underneath it all, still not widely perceived, is that Musk is also transforming the global electrical grid with a complete infrastructure of vast battery capacity.

Jobs, Musk and other disruptive founders built their organizations to maximize the value-creating potential of their visions. Those organizations are no less than the living, breathing manifestations of their founders’ identities and are as unique as the founders themselves.

After the Founder

Once the founders have departed, subsequent leaders, in order to maximize the quality of their decision-making, will always need to be aware of the identity that still pulses at the heart of their organizations. Without this essential understanding, the dangers are ever-present that the easy persuasiveness of Big Data, married to the seemingly incontrovertible direction supplied by Artificial Intelligence will, eventually, lead even the most successful organization astray.

So, are CEOs really necessary anymore? Yes, if they realize that their main job is to ensure that the identity of their institutions provides the center of gravity around which Big Data and AI are reliably deployed. Otherwise, companies are in peril of becoming driverless, autonomous vehicles, subject to an uncertain future fraught with potentially lethal hazards.


About the Author

Gerald Sindell is a partner of The Identity Dynamics Institute. He was the CEO of two New York publishing companies, Tudor and Knightsbridge. He has been instrumental in developing enterprise operating systems for EOS Worldwide, Accenture, and The Balanced Scorecard Institute.

Key Office Trends in the New Normal

Before the COVID-19 pandemic, companies were used to having employees commute to work every day and spending all day long within office premises. The coming of the pandemic toppled many age-old workplace institutions and norms as companies shifted to remote working. Businesses and employees had to adapt to new ways of working as the old norms vanished, ushering in a new age of novel technologies like Zoom.

Before the pandemic, only 12 percent of American workers were working remotely full-time — a figure still higher than the six percent in the UK. Even with vaccine distribution spurring some optimism about a return to normal business, experts have indicated that traditional office life and the pre-pandemic workstation may no longer be attainable.

Flexible Working Solution

The working environment and culture as we know it is changing every day, and many people are still figuring out the perfect setup. Remote working has seen some workers set up dedicated offices in their homes, while others share space with the rest of the family. Employees at major tech companies like Google and Twitter can already work from home for as long as they wish, with an option of switching to remote working permanently.

In addition to protecting their employees, companies will have to rethink the office space and arrangement to maintain comfort and flexibility. Companies will also have to contend with some employees juggling between remote working and commuting to the office.

Wellbeing and Human-Centric Design

Another significant shift in the new normal will be creating office designs emphasizing employee safety and empowerment. The pandemic has shown the value of minding others’ health and wellbeing, and companies will be paying attention to this as they reset the office.

The post-pandemic office design will consider things like handwashing facilities, built-in social distancing, flow management, readily available PPE, and hand sanitizer points. This will not only protect employees but also boost their confidence and alleviate anxieties.

The use of Technology

Office design will prioritize smart buildings that collect and share information on when and how to use different spaces. This is very important in the context of social distancing, contact tracing, and space rationalization.

Companies are installing touchless technologies and thermal imaging systems to safeguard employees from transmission and allow easy movement around the office premises. There are also desk-booking apps that ensure only one person touches a given desk each day.

Connectivity and Community

The new normal will see a rise in design models that emphasize hospitality and leisure. There is a focus on amenities and communal experience to attract and retain the best talent. The pandemic highlighted the importance of interpersonal relationships and how essential interactions are, both at work and in personal life.

Reassessing Company Values

The pandemic has forced companies to rethink and reassess their values and principles. The lessons learned from the COVID-19 pandemic will cause companies to execute necessary internal shifts to adapt to the new normal.