Experimenting with supercomputers in outer space

On 14 August this year, Hewlett Packard Enterprise sent a supercomputer to the International Space Station (ISS) aboard a SpaceX resupply mission. The machine is part of an experiment to see how such a sophisticated, unmodified computer behaves in outer space for a year – roughly the time it might take us to reach Mars.

Dr. Frank Fernandez, Hewlett Packard Enterprise's head payload engineer, explained that computers deployed on the ISS are always modified first. Unfortunately, hardening a computer for space takes so long that space-qualified machines are usually several generations behind the ones we use today. This means complex tasks are still performed on Earth and the results transmitted to the ISS. The model works for now, but the greater the distance, the longer transmissions from Earth take to reach their destination.

Taking into account the complexity of a trip to and a landing on Mars, it is crucial to ensure instant access to advanced computational power – not hardware that is five or even two years old. The idea of the experiment is to see whether processor speed and memory refresh rates deviate in space, and to make adjustments wherever necessary for optimal results.

Dr. Fernandez said that error detection and correction can be handled by built-in hardware when it is given enough time. When exploring space, however, time is sometimes a luxury. The main idea is to have the computing tools on board the spacecraft: the more time saved on computation, the more experiments can take place in outer space. Another benefit is lower bandwidth usage on the Earth–ISS network, since much of it currently carries computational data. If the processing power is on the station itself, that bandwidth can be used for other purposes.
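
As a rough illustration of how such error-correcting hardware works – a textbook sketch, not the scheme HPE actually flies – a Hamming(7,4) code protects four data bits with three parity bits and can correct any single flipped bit:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit in a 7-bit codeword, return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit, 0 if clean
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

Scanning memory with a code like this and rewriting the corrected words is essentially what hardware "scrubbing" does; the open question in orbit is how often radiation-induced bit flips occur and whether such safeguards can keep up without dedicated time.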

In the end, the goal is to find the optimal technology we currently have to aid our efforts in space exploration.

Source

Image source

Enernet – the basis for smart cities

If you are looking for a profitable and secure long-term investment, the energy sector might be it. Nearly 30 years ago the hot topic was the internet; today it's the enernet. Let's first clarify what the enernet is: a dynamic, distributed, multi-party energy network built around clean energy production, storage and distribution, serving as a major foundation for smart cities.

In large, oligopolistic markets the only way to compete with the establishment is through innovation. In the 1990s, small innovators of this kind fundamentally changed the way we access the internet – from analog to digital, over existing cable networks.

Today, for example, SolarCity is trying to drastically reduce – and potentially eliminate – our dependence on nuclear power and fossil fuels. Smart equipment also reduces dependency on the existing power grid, which is costly to maintain and unprepared to meet the energy needs of tomorrow. The mind behind SolarCity is one of today's leading visionaries, Elon Musk; his better-known companies, Tesla and SpaceX, share SolarCity's aim of leading humanity toward a sustainable future. SolarCity recently introduced solar roof tiles that, combined with a home system for storage and electric vehicle charging, can help houses (and cars) become energy-independent.

Innovation goes beyond Musk's visionary companies – enernet discoveries are only beginning to emerge. Nanogrids, microgrids, distributed energy resources and virtual power plants are all under development. These inventions will drive down production costs for many businesses. They will also make investing in an energy self-sufficient home more feasible, as the initial expense will pay off much faster.

The logical outcome of these discoveries is smart cities that are healthier and more secure. Development will be multidimensional, with energy security and grid stability robust enough to withstand superstorms and cyberattacks.

There is, of course, doubt that it will all be sunshine and roses. Pessimists argue that implementing these technologies in our daily lives would take too much time and money. But look at the internet delivery system again: the transformation from analog to digital was remarkably fast and cheap. Economies of scale will keep accelerating this process until every home is riding “the green wave”.

Utility companies that own electric grids and distribution networks have a huge opportunity in front of them to stay relevant. Why do they not feel more pressure? The market is open today, but the initial investment is so huge that small and mid-sized enterprises cannot even consider entering. If established businesses approach the situation intelligently, they will still make money – not nearly as much as in the current oligopolistic setting, but enough to be profitable.

Fossil-fuel companies also risk losing their enormous profits, but as they are among the planet's leading polluters, we cannot trade our species' existence for corporate profits.

Enernet innovation, like most innovation, is usually not grown in-house but acquired. Either way, there is simply no other direction to go, as it's cleaner and cheaper.


One more business-oriented view on algorithms

In this article we would like to discuss the business significance of artificial intelligence – how complex analytics tools help data scientists make sense of large amounts of information, whether historical, transactional or machine-generated.

When used properly, these algorithms help detect previously undetectable patterns – for example, in customer buying behavior, or similarities between seemingly unrelated cyber attacks.
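
As a toy illustration of what "similarity" can mean here – our own example, not one from the article – two attacks described by numeric feature vectors (request rates, ports touched, payload sizes and so on) can be compared with cosine similarity:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0
```

A score near 1.0 for two supposedly different incidents hints that they may share an author or toolkit.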

One big problem with algorithms is their short life span. Besides great math, one also needs to be constantly aware of which algorithms are becoming obsolete and replace them with new, better ones. This has to happen continuously and quickly, as in today's complex business environment only the highest-yielding algorithms survive.

After a hacker attack, the defense system is updated to neutralize the threat – but hackers always come back with a different approach. To stay secure, companies have to develop algorithms at least as quickly as the cybercriminals do. On Wall Street, even the best algorithms stay profitable for six weeks at most; within that period, competitors manage to reverse-engineer and exploit them. In cybersecurity, algorithmic efficiency is what matters most. One well-known case involves the retailer Target: its systems detected the intrusion, but the company had no algorithm to separate the real attack from unimportant errors, as the flood of information was simply too voluminous.

In financial services the stakes are extremely high, so algorithmic fraud detection is also highly developed. Knight Capital, an American global financial services company, lost approximately $440 million in 45 minutes back in 2012. Needless to say, the company no longer exists, while algorithmic innovation only keeps growing.

It is essential to come up with a way of compiling data that sieves out the enormous amounts of irrelevant information. As systems reveal new patterns of failure, an innovative approach can improve on what has already been done.

Another field showing rapid algorithmic development is predictive maintenance. The Internet of Things is expected to reach our household items by 2020 but is already in industrial use, with algorithms developed to identify early signs of system failure and alert the command center.

There are a number of tools that help in the development of algorithms:

  • visual analytics – real-time pattern recognition used to explore enormous amounts of historical data and build usable models
  • streaming analytics – inserting algorithms directly into streaming data in order to monitor it live and isolate patterns or detect potential threats
  • predictive analytics networks – a place where data scientists help each other and polish their algorithms with a degree of reciprocity
  • continuous streaming data marts – used to monitor an algorithm in real time with the possibility of calibrating it
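
To make the streaming-analytics idea concrete, here is a minimal sketch of an algorithm embedded directly in a data stream, flagging readings that deviate sharply from a rolling baseline. The window size and threshold are illustrative assumptions, not parameters of any tool listed above:

```python
from collections import deque

def streaming_outliers(stream, window=20, threshold=3.0):
    """Yield (index, value) for readings more than `threshold` standard
    deviations away from the rolling mean of the previous `window` values."""
    history = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(history) == window:
            mean = sum(history) / window
            std = (sum((v - mean) ** 2 for v in history) / window) ** 0.5
            if std > 0 and abs(x - mean) > threshold * std:
                yield i, x  # potential threat or early failure sign
        history.append(x)
```

The same rolling-baseline idea underlies predictive maintenance: a sensor reading drifting outside its normal band can be an early sign of failure.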

Top companies continuously improve their algorithms. In the environment they have created, only the fittest algorithms survive – and those survivors will drive smarter, sounder business choices.


Some major challenges the IoT will bring

In our previous posts we mentioned predictions of around 20 billion IoT devices connected by 2020. Today that forecast may even be pessimistic, since Business Insider quotes numbers above 30 billion. This represents an enormous opportunity for better energy efficiency and data security, but the challenges raised by such rapid growth have to be understood quickly, as they might drastically slow the implementation process.

In this article we will examine some of these challenges.

Device authentication

One very important aspect of IoT ecosystem security is identifying devices, and thus preventing “outsiders” from entering the ecosystem. Today, authentication is done on cloud-based servers – a reliable choice when tens of devices are connected. Once you pile up thousands or millions of IoT devices, however, authentication becomes a liability. Current practices are not advanced enough in security terms: they require an Internet connection, which drains batteries and is useless when the connection drops. On top of that, people generally do not understand the problem of scale, according to Ken Tola, CEO of the IoT security startup Phantom. He says a peer-based approach could easily handle it: functionality moves between the IoT devices themselves, so authentication can happen simultaneously across millions of devices without requiring an Internet connection.
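
A peer-based scheme of the kind Tola describes could, in principle, rest on a shared-secret challenge–response exchange that needs no Internet connection at all. The sketch below is hypothetical – it is not Phantom's actual protocol – but it shows the shape of such an exchange:

```python
import hashlib
import hmac
import os

def make_challenge():
    """Verifying device generates a random nonce and sends it to the peer."""
    return os.urandom(16)

def respond(shared_key, challenge):
    """Proving device answers with an HMAC of the challenge under the shared key."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify(shared_key, challenge, response):
    """Verifier recomputes the HMAC and compares in constant time."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because both sides only need the pre-shared key and a radio link, the exchange works peer to peer, offline, and in parallel across any number of device pairs.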

The same startup is working on an M2M (machine-to-machine) security layer able to identify two devices in a peer-based manner and establish the levels and types of communication allowed between them.

Wireless communications

The IoT is the logical expansion of the Internet from our computers to our appliances, digitizing some of our everyday activities via wireless connectivity. The majority of IoT devices depend on radio-frequency (RF) protocols such as Bluetooth and Wi-Fi. However, RF-based devices shut each other down through interference, and this problem will only grow as more IoT appliances are added. One current remedy is the additional 5 GHz band for Wi-Fi, but as the projected number of devices keeps growing, interference will persist. Another issue is power consumption, as IoT items run on batteries.

An alternative is to substitute RF with Near Field Magnetic Induction (NFMI) for data transfer. An NFMI signal decays much faster, so much of the interference is gone. NFMI creates a wireless “bubble” in which IoT devices connect while outside signals are ignored. Security protocols active within the bubble drastically reduce threats, and the fast signal decay allows the same frequency to be reused by a different device nearby. On a side note, NFMI has been used in hearing aids and pacemakers for more than a decade, but it might be the key to revolutionizing the IoT.

Traffic administration

Managing IoT devices can quickly become impossible if it is not considered at the earliest stages of implementation. Smart homes are one thing, but we will live in smart cities where parking machines and traffic sensors will also transmit data. Administration and integration should be as simple as possible. The biggest potential issue is many devices transmitting data at the same time – and as mentioned above, NFMI might be the key.

Startups are using machine learning to manage complex automated networks that will consist of thousands of IoT devices. The algorithms will provide real-time distribution-system control and self-management for big networks geographically spread over vast distances.

Most of the technologies we use will further develop to a level supporting the needs of an interconnected world. Those that are left behind will be replaced by ones ready to take on the challenges and opportunities the IoT will bring.


Fintech vs Banking – technology contesting the old establishment

The banking industry is facing a big challenge, driven by technological advancements and the way they continuously alter consumer behavior. Two major trends are noticeable: the transition from physical to digital money, and much higher demand for convenience on the customer side.

Gaining back market presence

In recent years banks' innovation has been sluggish, and that's putting it mildly. PayPal, Square and Mint were visionary enough to build services that digital users have adopted and use daily. Today you can send money to a friend with Facebook Messenger or Venmo rather than a bank's app. Still, despite the progress in fintech, banks are not going anywhere in the foreseeable future: money-transfer systems are too difficult for any startup to replace, and even Apple Pay runs on existing payment rails. If banks do not change their customer-facing systems, however, they will at first merely provide the engine for fintech companies – and later may disappear altogether.

Feeling too secure has left banks in lethargy

Banks still feel secure – the huge initial investment is by itself an enormous entry barrier in this business. There is also still high demand for ATMs and branches, which is why these financial institutions have been somewhat “shielded” from technological progress. This sense of security made them less oriented toward consumer satisfaction and more focused on driving down internal costs and cross-selling services.

Today, however, money transactions have become almost entirely non-physical, and what customers need almost as much as security is convenience. It is no surprise that mobile payments integrated into different platforms enjoy big success. Banks will have to develop their own mobile solutions, buy existing apps or integrate tightly with the backbone of such software – otherwise we might simply outgrow them. According to J.D. Power, retail banking satisfaction dropped in 2015. Development in fintech has raised consumer expectations toward more personalized service while reducing initial costs for new entrants. This definitely puts banks in a position they would rather not be in.

It is no surprise that Venmo, LendingClub and Wealthfront have already captured profitable segments of an industry big banks spent millions to consolidate. Goldman Sachs even put a number on it: $470 billion in profits earned by fintech firms in 2015.

Banking in the following years

To stay in the game, banks will also need to form new strategic alliances. Messenger and Snapchat, for example, are quickly expanding their functionality, and these companies, alongside many others, will continue developing their financial-services offerings. They already have the most important asset – a large customer base satisfied with their core services – so added functionality will be readily accepted. One extra button in their apps could replace entire banking institutions.

Some banks will choose to invest deeply in building their own software, like Capital One with its Digital Lab. Others will more likely use their enormous resources to acquire fintech startups and offer an outside product to their customer base. In the coming years customers will choose their financial institution based on product experience rather than value-adding perks. Banks will not change fundamentally, but they will have to provide the best possible experience instead of focusing only on their profitability.


Virtual Reality reshaping business

This year we will witness Virtual Reality (VR) being adopted in many aspects of our lives. In January, Oculus announced a $599 price for the Rift, with shipments expected to start at the end of the first quarter. HTC and Sony are not far behind, with estimated shipment dates also in the first half of 2016.

Starting at the price of a new high-end smartphone, VR headsets are rather affordable. The research company Juniper estimates 3 million VR device shipments in 2016, rising to 30 million by 2020. It is not surprising that VR headsets first became popular for the groundbreaking gaming experience they provide. However, in this article we would like to look at an even larger sector that will make use of VR – the enterprise market.

When it comes to VR changing the business world, the sky is the limit. Unfortunately, there is very little information on how and in which industries this is already happening – perhaps because it is not as interesting as playing the newest Final Fantasy game on your VR device. Let's examine some relevant examples:

Real estate

This industry has always had a reputation as a late technology adopter. That is about to change: brokers can use VR to speed up sales and leases. Imagine brokers having access to all their listed properties without having to leave the office. Floored Inc. offers VR and 3D modeling software; running on an Oculus Rift headset, its software helped another company sell a Manhattan building before construction was finished.

Medicine

Advanced simulators have existed in this field for a number of years; only now, however, are VR devices affordable. In January 2016 a group of surgeons was able to perform open-heart surgery on a four-month-old baby thanks to VR imaging software and a Google Cardboard viewer. Devices like the Oculus Rift or Google Cardboard allow doctors to practice everything from simple day-to-day procedures to rare surgical interventions. The technology also benefits patients, who can learn what was performed on them and how to recover properly.

Manufacturing

Today, product development teams are using VR to improve engineering, design, manufacturing and operations. Ford's Immersive Vehicle Environment Lab has seen a significant improvement in productivity and a drop in costs since implementing virtual testing and prototyping. Instead of spending hundreds of thousands of dollars on a physical prototype, companies can test new concepts virtually in VR labs.

VR in the enterprise is not yet seeing nearly as much use as in entertainment. However, we predict that business-related use will take the lead in the coming years, as businesses simply need more time to adapt. For this to be feasible, enterprise VR software and content must also expand. The future holds virtual training and certification for any position anywhere on the globe. The prospects of VR are limitless, and we are eager to find out how this technology will reshape business.


Robots changing the ways we conduct business

The new generation of robots is already here, mainly in warehouses and factories, and it is having a huge impact on productivity and logistics. These machines use the latest developments in software and artificial intelligence, revolutionizing the way goods are produced and distributed.

Another notable development already transforming industries is Vision Guided Vehicles (VGVs). They can transport heavy shipments and ensure same-day deliveries, reducing human labor costs to a minimum. On a larger scale, VGVs are already in use in the automotive industry and in retail.
In this article we will examine three companies in the autonomous robot industry that are drawing a lot of attention. We are very interested in how these inventions will improve supply chains across business sectors over time.

Starship’s ground drone

Starship Technologies is introducing autonomous robots that deal with the hardest part of the delivery process – the “last mile”. These machines are meant to deliver groceries and small packages to suburban households. Operating autonomously almost the entire time, each robot uses navigation software plus a camera and radar to avoid obstructions. It also has speakers and a microphone, so real-time communication is possible. Climbing small steps is also possible thanks to its six wheels.

The company is planning to launch a beta program in both the US and the UK, so if everything goes as planned we might see these machines on sidewalks relatively soon.

Tally the retail robot

Simbe Robotics is building a tall, cylindrical robot that monitors grocery store inventory, making sure items are in the correct section, properly stocked and correctly priced. One Tally robot can check 15,000 items per hour, pausing only to take high-resolution pictures. The machine cannot yet fix the shortcomings it notices, but it uploads the data it collects to the cloud for further analysis and then presents potential solutions to retailers via a mobile app.

One huge advantage of this technology, which also applies to VGVs, is that it requires no new infrastructure to do its job. Tally works just fine around customers and staff, even during busy business hours, and is programmed to return to its charging dock when low on power.

Prospero Farming bots

With the human population increasing every year, growing, harvesting and distributing food effectively is now a global problem – the global population is estimated to rise by another 2 billion people by 2050. David Dorhour is working on a robot that could revolutionize agriculture.

Prospero is a small, six-legged robot which, when joined by hundreds of others, can quickly and systematically plant acres of land. It won't use GPS or visual guidance; it only sees the ground directly underneath it. When it detects a seedless patch of soil, it plants a seed at a precise depth, sprays it with a preprogrammed amount of fertilizer, covers it, and sprays another fluid that lets other Prosperos know the spot is already planted.
Each robot will be equipped with a radio, so when one spots a larger uncultivated patch of land it can call for backup. By the same principle, if it finds an already planted area, it will tell the others to work elsewhere.

Instead of constantly monitoring each other's positions, Prosperos will rely on simple communication. This greatly reduces the computational power required and makes the system highly usable in rural areas. Farmers will have total control – instead of making decisions field by field, they could make them plant by plant.
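
The planting behavior described above can be sketched as a simple scan over a grid of soil cells. This is our illustrative reconstruction of the logic, not Dorhour's code; the cell states and the planting count are hypothetical:

```python
EMPTY, PLANTED = 0, 1  # hypothetical markers for what the robot "sees" beneath it

def plant_field(field):
    """Scan each cell of a 2D field; plant and mark every empty one.
    Returns the number of seeds planted."""
    planted = 0
    for row in field:
        for i, cell in enumerate(row):
            if cell == EMPTY:
                # plant at a precise depth, apply fertilizer, then mark the
                # spot so other robots skip it
                row[i] = PLANTED
                planted += 1
    return planted
```

In the real swarm, the "mark" is a chemical spray and a radio broadcast rather than a flag in shared memory, but the division of labor is the same: each robot only ever reasons about the patch directly beneath it.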

Even though tomorrow's industries will look very different, humans and robots will work in collaboration. Machines excel at repetitive, monotonous work, but there is no replacement for human creativity and intellect.

Everyone should embrace this new technology, which will lead to a brighter future. Yes, in the short run many people will lose their jobs, but that is the cost of progress.


The era of open-source software has come

In the way business is conducted nowadays, one thing stands out: code is no longer among the most important assets. What really matters is how it is used to connect with customers. Just look at 3D printing or Uber, and you will find that these cutting-edge technologies run on open-source software.

Some 20 years ago, licensed software single-handedly created monopolies and generated a lot of revenue. Today, open-source code facilitates economic growth. Keep in mind that economies worldwide are shifting (with some exceptions) from production-based to service-based. Logically, in software the goal has moved from creating and owning code toward serving customers. Like hardware components, programming languages hold little value by themselves anymore; it is what the code does that separates small companies from those valued in the billions. In 2014 Tesla released all of its patents, fast-tracking a new dawn in tech without losing its spot as a global leader.

Today a software company has to position itself in an environment where the old model no longer applies: it must be the “go-to” service, attracting millions of customers very fast. In June this year, Apple announced that it would open the source of its programming language, Swift. The move proved that even the biggest players support this new way of conducting business. Forcing one language onto a single platform is no longer the norm.

Many recently created programming languages have an open-source basis – for example Google's Go and Mozilla's Rust. Another big player, Microsoft, opened the source of its .NET platform, allowing developers to work outside the company's ecosystem. This collective effort opens numerous opportunities – for example, new ways of optimizing computational capacity.

Value in this new environment lies in the balance between empowering and serving your end consumers. Uber, built on Python, Node.js and other open-source systems, has all but destroyed the taxi industry. Red Hat provides all of its code and accumulates revenue through enterprise services.
On-demand services like Netflix have shifted consumer behavior to a state where a high level of service is the norm. Open source is the only way to learn exactly what your customers want. Open-source tools are also cheaper, faster and more powerful. One huge challenge for already established businesses is adjusting to this new way of conducting business.

When integrating an open-source player, it is very important not to dilute the purity of the brand. When Microsoft acquired Revolution Analytics, the latter assured its customers that it would continue to develop and support its products for every platform, including non-Windows ones.

In the software game there are three simple questions; if you can answer them, you are in it for the long run: What service does your business own? Whom does it connect? And what does it connect?

It can only be done through open-source software.


Security issues with the Internet of Things

Not too long ago, it would have been a wild guess to assume that your phone would be able to do everything your computer can. A couple of years from now, your toaster will most likely have cool new functionalities too. These new possibilities, however, also open completely different angles of attack for cyber criminals. The Internet of Things (IoT) era is upon us, and millions of devices, from your office to your car, will be connected. The introduction of IPv6 and the ever-increasing number of Wi-Fi networks are the main reasons behind estimates of 40 billion IoT devices by 2020. We must anticipate threats adequately, since more connected devices present more avenues for potential attacks.

The good news is that this issue has already caught the attention of security specialists. In October this year, researchers found vulnerabilities in several brands of IoT baby monitors which hackers could use to access live feeds or tamper with camera settings. Another security breach was found in Internet-connected cars: hackers could take control of the entertainment system, unlock the doors or even shut the automobile down while in motion. All smart devices pose a threat to your data, even wearables – hackers can use motion sensors to infer what you are typing, or gather health data through installed apps. Worst of all is the possible harm to people's health through breaches of medical devices.

Serious action has already been taken at the federal level in the US to close the gaps exploited by malicious software, and security firms and manufacturers are joining efforts on IoT security before the industry spins out of control in its infancy. Gemalto, a digital security company, is applying its experience in mobile payments to securing IoT devices. The company will offer its Secure Element (SE) technology to firms in the automotive and utility sectors. SE is a tamper-resistant component embedded in devices that enables sophisticated digital security through encryption and access control of crucial data.

Microsoft has also addressed the IoT security issue: it is adding BitLocker encryption and Secure Boot technology to Windows 10 IoT, the company's operating system for IoT devices. BitLocker is encryption technology that can encrypt entire disk volumes, protecting on-device data. Secure Boot, developed by PC makers, ensures a device boots only with software the manufacturer trusts, guarding against hijacking.

Vodafone, along with other tech players, has been very active in forming the IoT Security Foundation, a non-profit organization that examines Internet-connected devices for flaws and offers security advice to involved parties. It hopes to raise awareness through cross-company collaboration.

While there are notable efforts in the right direction, a lot more needs to be done. The gateways connecting IoT devices to other networks need to be secured, not only the devices themselves. IoT gadgets are always connected and typically go through only one-time authentication, which makes them a perfect tool for infiltrating company networks. At this point, the security of people and organizations is not assured. Repositories containing big IoT data are also at risk from hackers who rely on big data to run their schemes. Recent breaches resulting in data theft should serve as a wake-up call. We need unified planning for tackling security issues through security updates: manual installation on numerous devices is certainly out of the question, but automatic updates can pose security threats of their own.

It is evident that IoT devices will become an inseparable part of our lives sooner rather than later, and it is up to the whole tech community to raise awareness and come up with sustainable solutions.


Hacking today is becoming a centralized business

When asked what a hacker looks like, most people picture something out of the movies depicting that world. In reality, hackers are experts at covering their identity – and you certainly would not imagine a neatly dressed executive working for a well-organized business aiming at an optimal return on investment. Reality is stranger than you think.

Today's hackers have reached a level far beyond the individual know-it-all portrayed in movies: they are part of a whole cybercrime supply chain generating billions of dollars each year. The massive data breaches we have observed in recent years are the strongest evidence that hackers have developed enterprise-mimicking operations on a global scale. One plausible reading is that stealing billions from world-leading brands is just a demonstration of what is possible. Beyond this alleged muscle-flexing, the most skilful hackers – those able to stay undetected – naturally remain unreported, and very, very rich.

With software now used in virtually every aspect of our lives, and many types of hackers eager to exploit vulnerabilities, defense can feel like an uphill battle. It is very likely that security specialists will never fully win this fight.

Before the huge data breaches, one common belief about hackers was that they were only looking for credit card information, either to use themselves or to sell to the highest bidder. That is still a big part of fraud, but the same kinds of attacks can also yield account credentials, which are even more valuable. Once credentials are collected, information such as user names, addresses, phone numbers, email accounts, payment methods and financial and business contracts can be obtained and used or sold.

Maximizing profit is the goal of any business, including those in the field of fraud: consistent results with minimal effort and money. Most hacker groups therefore go after account credentials, which unlock all sorts of private data that is inevitably online – mortgage accounts, tax data and so on.

The initial data breach is just the start. Rarely will the person who stole your data use it himself; more often, other hackers buy it and focus on the parts that match their skill set. It may sound strange, but today it is more important to protect your email than your credit or debit card details. The transactions take place on the deep web, on sites that change all the time. Supply and demand for fraudulent data have reliable mediums of exchange, and it is no surprise that this market is thriving. Notably, the market is also driven by reputation: the higher the quality of the stolen data, the more repeat sales – and the higher the initial price.

Why is it more important to protect your email than your bank cards? Imagine this: five years ago you sent a loan application by email. Once that mailbox is stolen, hackers have all the data required to create false identities or take out loans themselves, with disastrous consequences.

Now let's examine the fraudulent supply chain. Getting hold of the credentials is just the beginning: as in any other business, the goods have to make their way through the supply chain before reaching the end consumer. There are four general kinds of hackers, with many more specializing in narrower areas:

  • Hackers who acquire credentials. Their expertise is finding vulnerable spots, such as third-party vendors that do business with their target. Their palette of scams is large, with techniques such as phishing and social engineering; their main goal is to find the human weakness that allows them to take advantage of a system.
  • Hackers who sell the stolen data, most often on online hacker forums in the deep web. Prices are set according to the value and longevity of the information on offer.
  • Hackers who specialize in using credentials to extract consumable goods – credit card numbers, personally identifiable information, intellectual property and more.
  • Hackers who are proficient at using the stolen data. Their main goal is to bypass fraud monitors so they can actually monetize the stolen information.

The conclusion is that many organizations and people support the hacker ecosystem, directly or indirectly – and what is scarier is that some of them might not even realize it.

Companies are inefficient at implementing best practices and the appropriate technologies to protect what matters most – like email. One very effective measure is adopting two-factor authentication wherever possible. Stolen credentials on mobile devices are becoming harder to trace, and mobile devices host a number of private and business-related apps containing sensitive information.
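
Two-factor authentication is most commonly implemented with time-based one-time passwords (TOTP, RFC 6238): both the server and your phone derive a short-lived code from a shared secret and the current time. A simplified standard-library sketch:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, step=30, digits=6):
    """Derive a time-based one-time password from a shared secret
    (simplified RFC 6238: HMAC-SHA1, 30-second time steps)."""
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Even if a phishing site captures the code, it expires within half a minute – which is exactly why stolen passwords alone are worth far less against accounts protected this way.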

The good news is that many businesses nowadays are developing new systems to stop hackers from exploiting the places where they operate most. Some governments are also taking steps to assist their citizens against cybercrime. The bad news is that hackers have shown over and over that they are great at adapting to change and staying ahead of security specialists. In the end, it is up to us to protect ourselves.