Daily Tech Digest - January 08, 2019

5G versus 4G: How speed, latency and application support differ

5G uses new, so far rarely used millimeter-wave radio bands in the 30 GHz to 300 GHz range, while current 4G networks operate on frequencies below 6 GHz. Low latency is one of 5G's most important attributes, making the technology highly suitable for critical applications that require rapid responsiveness, such as remote vehicle control. 5G networks are capable of latency of under a millisecond in ideal conditions. 4G latency varies from carrier to carrier and cell to cell, but on the whole, 5G latency is estimated to be 60 to 120 times lower than average 4G latency. Over time, 5G is expected to advance wireless networking by bringing fiber-like speeds and extremely low latency to almost any location. In terms of peak speed, 5G is approximately 20 times faster than 4G: the new technology specifies a minimum peak download speed of 20 Gb/s, while 4G pokes along at only 1 Gb/s. Generally speaking, fixed-site users, such as offices and homes, will experience somewhat higher speeds than mobile users.



This old ransomware is using an unpleasant new trick to try and make you pay up

This ransomware attack begins, like many others, with brute-force attacks targeting weak passwords on RDP ports. Once inside the network, the attackers harvest the admin credentials required to move across the network before encrypting servers and wiping back-ups. Victims are then presented with a ransom note telling them to email the ransomware distributors. The note also warns victims not to use any security software against CryptoMix, with the attackers claiming that this could permanently damage the system -- a common tactic used by attackers to dissuade victims from using security software to restore their computers. But if a victim engages with the attackers over email, they'll find that those behind CryptoMix claim the money made from the ransom demand -- usually two or three bitcoins -- will be donated to charity. Obviously, this isn't the case, but in an effort to lure victims into believing the scam, the CryptoMix distributors appear to have taken information about real children from crowdfunding and local news websites.


DevOps to DevSecOps adjustment banks on cross-group collaboration


Eventually, the CIO and chief information security officer (CISO) must participate in the DevOps discussion. This is especially true when a development project needs to be compliant with the Sarbanes-Oxley Act, Health Insurance Portability and Accountability Act or other compliance standards. This discussion needs to cover how teams can work together to achieve IT delivery goals with the best use of resources. For Peterson, the borderless environment of the cloud makes it tougher for CISOs and their teams to keep an organization secure. Security, development and operations teams must agree to communicate and share knowledge across domains. Because so many people have roles in a company's security strategy, Sadin suggested organizations designate a single point person for risk or security -- DevOps shop or not. But don't let security silo itself, Rowley cautioned. When security becomes so removed from the process that it doesn't function well with the other C-level executives or departments, security will fail.


HQ 2.0: The Next-Generation Corporate Center


In short, the economies of scale associated with centralized services have eroded. The functional security blankets justifying their expense no longer apply; it is not enough anymore to meet regulatory requirements and provide basic internal services. The business units and the new generation of talent are demanding more. Finally, the combination of new digital Industry 4.0–style platforms, robotics, intelligent machines, and advanced analytics is allowing companies to harness the explosion of data and fundamentally alter how and where work gets done. ... The new corporate center will be smaller, but it will still have executive and functional leaders and their staffs, mostly limited to five ongoing roles. First, they will define and communicate the company vision, values, and identity. Second, they will develop the corporate strategy and be responsible for the necessary critical enterprise-wide capabilities. Third, they will oversee the business unit portfolio, and related legal, regulatory, and fiduciary activities. Fourth, they will allocate capital.


Don’t Panic: Biometric Security is Still Secure for Enterprises

The early assumptions surrounding biometric security -- that it could supplant passwords -- fuel the current panic in the discourse. However, biometric security remains secure so long as your enterprise treats it as another layer in your overall authentication platform. When you incorporate biometric security into your two-factor authentication, your access management becomes stronger; hackers will have to acquire both your employees’ passwords and their biometric information to break into the network. However, two-factor authentication faces its own scrutiny. Hackers have found ways to subvert the traditional authentication use of mobile devices and insert themselves into the authentication process. Therefore, enterprises are embracing multi-factor authentication (MFA) for its more layered approach to access management. Additionally, multi-factor authentication can be applied in a granular fashion: your regular employees may only require two-factor authentication, whereas your most privileged users may need as many as five factors to access your sensitive digital assets.
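The granular approach described above can be sketched as a small policy function. The role names and factor counts below are purely illustrative assumptions, not taken from the article:

```java
// Minimal sketch of granular multi-factor authentication policy.
public class MfaPolicy {
    // Map a user's privilege level to the number of authentication
    // factors required before access is granted (illustrative values).
    public static int requiredFactors(String role) {
        switch (role) {
            case "privileged": return 5;  // e.g. password, token, biometric, device, location
            case "admin":      return 3;
            default:           return 2;  // regular employees: password plus one more factor
        }
    }

    // Access is granted only when enough independent factors were presented.
    public static boolean grantAccess(String role, int factorsPresented) {
        return factorsPresented >= requiredFactors(role);
    }

    public static void main(String[] args) {
        System.out.println(grantAccess("privileged", 2)); // not enough factors
        System.out.println(grantAccess("regular", 2));    // two factors suffice
    }
}
```

The point of the sketch is that biometrics appear here as one factor among several, never as the sole gate.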


Threat of a Remote Cyberattack on Today's Aircraft Is Real

Responding to the attack, Boeing issued a multiparagraph statement that included this passage: "Boeing is confident in the cyber-security measures of its airplanes. … Boeing's cyber-security measures … meet or exceed all applicable regulatory standards." ... To solve it, we need industry regulations that require updated cybersecurity policies and protocols, including mandatory penetration testing by aviation experts who are independent of manufacturers, vendors, service providers and aircraft operators. Be mindful of those who claim aviation expertise; few have the necessary experience, but many claim they do. "Pen testing" is essentially what DHS experts were conducting during the Boeing 757 attack. A pen test is a simulated attack on a computer system that identifies its vulnerabilities and strengths. Pen testing is one of many ways to mitigate risk, and we need more trained aviation and cyber personnel to deal with the current and emerging cyber threats — those that haven't even been conceived of yet.


Enterprise search trends to look for in 2019

Imagine if your company’s intranet search were as easy, personalised, and contextual as Google’s internet search. Cognitive search will help make this a reality by giving enterprise users the ability to locate truly relevant text, image, and video files from within large volumes of both internal and external data. One of the biggest challenges facing enterprise search is the nature of much of the data. Gartner estimates that 80% of organisational data is unstructured, meaning that it doesn’t adhere to predetermined models. This results in irregularities and ambiguities that can make the data difficult to find using traditional search programs. AI programs help automatically tag this unstructured information, making it much more easily discoverable. Cognitive search also improves accuracy by considering the context of each query. By examining and learning from past searches, these types of systems can identify the person who is looking for the information and what type of content the person is expecting to find.
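As a toy illustration of automatic tagging, the sketch below tags a document with its most frequent terms. This is a deliberate simplification: real cognitive search uses trained language models, not raw term frequency, and the stopword list here is an assumption:

```java
import java.util.*;
import java.util.stream.*;

// Toy auto-tagger: tag unstructured text with its most frequent terms.
public class AutoTagger {
    private static final Set<String> STOP = Set.of("the", "a", "of", "and", "to", "in", "is", "it");

    // Return the n most frequent non-stopword terms as tags for a document.
    public static List<String> tags(String text, int n) {
        Map<String, Long> freq = Arrays.stream(text.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty() && !STOP.contains(w))
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
        return freq.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(n)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(tags("search search engine index engine search", 2));
    }
}
```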


IoT devices proliferate, from smart bulbs to industrial vibration sensors

Arguably the biggest and most-established use in this area is preventive maintenance, usually in an industrial setting. The concept is simple, but it relies on a lot of clever computational work and careful integration. Preventive maintenance uses gadgets like vibration and wear sensors to measure the stresses on and performance of factory equipment. For example, in a turbine those sensors feed their data into software running either on an edge device sitting somewhere on the factory floor, for quick communication with the endpoint, or on a server somewhere in the data center or cloud. Once there, the data can be parsed by a machine-learning system that correlates real-time data with historical, enabling the detection of potential reliability issues without the need for human inspection. Fleet management is another popular use case for IoT devices. These systems either take advantage of a GPS locator already installed in a vehicle or add a new one for the purpose, sending that data back to the company via cellular network. This allows rental car firms -- or really any company with a large number of cars or trucks -- to keep track of their movements.
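The "correlate real-time data with historical" step can be sketched in its crudest form as a z-score check of a fresh sensor reading against a historical baseline. This is a stand-in for the machine-learning models production systems actually use; the threshold and readings are assumptions for illustration:

```java
// Crude anomaly check: flag readings far outside the historical baseline.
public class VibrationMonitor {
    static double mean(double[] xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return sum / xs.length;
    }

    static double stdDev(double[] xs, double mean) {
        double ss = 0;
        for (double x : xs) ss += (x - mean) * (x - mean);
        return Math.sqrt(ss / xs.length);
    }

    // A reading is anomalous when its z-score against the historical
    // readings exceeds the threshold -- a possible early reliability issue.
    public static boolean isAnomalous(double[] history, double reading, double zThreshold) {
        double m = mean(history);
        double s = stdDev(history, m);
        return s > 0 && Math.abs(reading - m) / s > zThreshold;
    }

    public static void main(String[] args) {
        double[] baseline = {1.0, 1.1, 0.9, 1.0, 1.05, 0.95}; // historical vibration levels
        System.out.println(isAnomalous(baseline, 1.5, 3.0));  // well outside the baseline
        System.out.println(isAnomalous(baseline, 1.05, 3.0)); // within normal variation
    }
}
```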


A Framework in C# for Fingerprint Verification


Fingerprint recognition is an active research area nowadays. An important component in fingerprint recognition systems is the fingerprint matching algorithm. According to the problem domain, fingerprint matching algorithms are classified into two categories: fingerprint verification algorithms and fingerprint identification algorithms. The aim of fingerprint verification algorithms is to determine whether two fingerprints come from the same finger or not. On the other hand, fingerprint identification algorithms search a query fingerprint in a database, looking for the fingerprints coming from the same finger. There are hundreds of papers concerning fingerprint verification but, as far as we know, no framework for fingerprint verification is available on the web. So, you must implement your own tools in order to test the performance of your fingerprint verification algorithms. Moreover, you must spend a lot of time implementing other authors' algorithms to compare against your own.
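At its core, a verification algorithm reduces to computing a match score between two prints and comparing it against a tuned threshold. The sketch below uses a toy set-overlap score over feature labels; real matchers compare minutiae positions and angles, so treat everything here as an assumption made only to show the decision structure:

```java
import java.util.HashSet;
import java.util.Set;

// Toy verification: same finger iff the match score clears a threshold.
public class Verifier {
    // Toy similarity: Jaccard overlap of feature labels extracted from each print.
    public static double matchScore(Set<String> a, Set<String> b) {
        Set<String> inter = new HashSet<>(a);
        inter.retainAll(b);
        Set<String> union = new HashSet<>(a);
        union.addAll(b);
        return union.isEmpty() ? 0.0 : (double) inter.size() / union.size();
    }

    // The verification decision: accept when the score reaches the threshold.
    public static boolean sameFinger(Set<String> a, Set<String> b, double threshold) {
        return matchScore(a, b) >= threshold;
    }
}
```

Evaluating such an algorithm then amounts to sweeping the threshold and measuring false match and false non-match rates, which is exactly the tooling a verification framework provides.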


Towards Successful Resilient Software Design

What is different these days is that almost every system is a distributed system. Systems talk to each other all the time, and the systems themselves are usually split into remote parts that do the same. Developments like microservices, mobile computing and (I)IoT multiply the connections between collaborating system parts, i.e., take that development to the next level. The remote communication needed to let the systems and their parts talk to each other implies failure modes that only exist across process boundaries, not inside a process. These failure modes -- e.g., non-responsiveness, latency, incomplete or out-of-order messages -- will cause all kinds of undesired failures at the application level if we ignore their existence. In other words, ignoring the effects of distribution is not an option if you need robust, highly available systems. This leads me to the “what” of RSD: I tend to define resilient software design as “designing an application in a way that ideally a user does not notice at all if an unexpected failure occurs or that the user at least can continue to use the application with a defined reduced functional scope”.
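The "defined reduced functional scope" idea can be sketched as a fallback wrapper around a remote call: when the remote part fails, the user gets degraded but usable content instead of an error. The method name and fallback value are hypothetical:

```java
import java.util.function.Supplier;

// Resilience sketch: degrade gracefully instead of surfacing remote failures.
public class Resilient {
    // Run a remote operation; on failure, return a reduced-scope fallback
    // so the user can keep working with degraded functionality.
    public static <T> T callWithFallback(Supplier<T> remote, T fallback) {
        try {
            return remote.get();
        } catch (RuntimeException e) {
            return fallback;  // the defined reduced functional scope
        }
    }

    public static void main(String[] args) {
        String page = callWithFallback(
                () -> { throw new RuntimeException("remote service unresponsive"); },
                "cached recommendations");
        System.out.println(page); // the user sees degraded content, not an error page
    }
}
```

Production systems layer timeouts, retries and circuit breakers around the same basic shape, but the user-visible contract is the one sketched here.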



Quote for the day:


"The quality of a leader is reflected in the standards they set for themselves." -- Ray Kroc


Daily Tech Digest - January 07, 2019

Want a hybrid workforce? The trick is getting humans and machines speaking to each other

A stealth company is trying to solve one of the oddest interoperability problems of the modern era: How do you get robots and non-engineers talking to each other? Founded by the former Director of Robotics for Google, the company, Formant, is making its first public bow thanks to a recently announced $6 million in funding from SignalFire. Formant's pitch is straightforward, and it illustrates the peculiar problem of automation in 2019: Robots perform a lot of tasks in industries like logistics and manufacturing, but those industries still rely on humans for crucial decisions robots can't yet make. Getting robots and humans communicating in real time to facilitate that decision-making has been tricky and usually requires an intermediary in the form of an engineer. ... "We founded Formant to answer the biggest problem that faces automation today: robots produce too much information, in disparate forms that cannot be viewed simultaneously," said Jeff Linnell, founder and CEO of Formant and a robotics insider with deep industry connections.


Expect banks and the fintechs they trust to reach wider arrangements that give banks more confidence in the security surrounding data sharing and third-party innovation.  The Financial Data Exchange, made up of big banks like JPMorgan Chase and Wells Fargo as well as data aggregators and fintechs, was established in 2018 to create a standard to safely share information and address risks tied to open banking. The group — along with the “Secure Open Data Access” framework formed earlier this year with support from Envestnet’s Yodlee, Quovo and Morningstar's ByAllAccounts — could put banks more at ease with open banking.  Meanwhile, banks such as BBVA Compass, Capital One, Citibank and Silicon Valley Bank continue to move forward with open-banking initiatives. ... Meantime, blockchain advocates at financial institutions are weary of trying to convince others of the technology's potential. Blockchain proponents admit bank executives and regulators still link the technology to wild swings in cryptocurrency values.



The well-crafted phishing web pages use custom web font files known as “woff files” to implement a substitution cipher that makes the source code of phishing pages appear benign. When the phishing landing page renders in the browser, users are presented with a typical online banking credential phish using stolen bank branding, but the page source contains encoded display text. Substitution functions in phishing kits are frequently implemented in JavaScript, the researchers said, adding that no such functions appeared in the page source. Instead, the researchers identified the source of the substitution in the CSS [cascading style sheet] code for the landing page. The researchers extracted, converted and viewed the woff and woff2 web font files to discover the phishing landing page was using those custom web font files to make the browser render the ciphertext as plaintext, while the malicious code remained hidden.
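The trick is easiest to see with a plain substitution cipher. In the sketch below, the page source would store the output of encode, while the custom font draws each substituted letter as the original one, so only a rendered view shows the plaintext. The substitution alphabet here is a hypothetical example, not the one the phishing kit used:

```java
import java.util.HashMap;
import java.util.Map;

// Substitution-cipher sketch of the woff-font obfuscation technique.
public class FontCipher {
    static final String PLAIN  = "abcdefghijklmnopqrstuvwxyz";
    static final String CIPHER = "qwertyuiopasdfghjklzxcvbnm"; // hypothetical permutation

    // What the kit stores in the page source: benign-looking ciphertext.
    public static String encode(String s) { return map(s, PLAIN, CIPHER); }

    // What the custom font effectively does at render time: map glyphs back.
    public static String decode(String s) { return map(s, CIPHER, PLAIN); }

    private static String map(String s, String from, String to) {
        Map<Character, Character> m = new HashMap<>();
        for (int i = 0; i < from.length(); i++) m.put(from.charAt(i), to.charAt(i));
        StringBuilder out = new StringBuilder();
        for (char c : s.toCharArray()) out.append(m.getOrDefault(c, c));
        return out.toString();
    }

    public static void main(String[] args) {
        String source = encode("login");     // what a scanner reading the HTML sees
        System.out.println(source);
        System.out.println(decode(source));  // what the victim's browser renders
    }
}
```

Because the mapping lives in the font file rather than in JavaScript, scanners that inspect only the page source see neither a substitution function nor the phishing strings.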



Top 4 enterprise tech trends to watch in 2019

How can security be improved? Advances in cloud computing and blockchain will help organizations better protect their data, Climer wrote in a recent article. “Though these aren’t new technology trends — blockchain and the cloud led conversations throughout 2018 — how businesses utilize these tech tools for their operational security will likely transition dramatically,” she wrote. Jessica Marie, director of product marketing at Vera Security, also said tech advances in cybersecurity will help. “I'm most excited about advancements in cybersecurity, particularly encryption technologies and securing data in the cloud/collaboration tools. Something tells me with recent breaches, this might become very necessary,” she said during the Twitter chat. Data governance will also play a large role in improving cybersecurity and data privacy, said Tyler James Johnson, founder and CEO of PrivOps. “I'm big on data and analytics solutions, as well,” Johnson said. “I see a big connection between that and data privacy and security. For me, 2019 will be the year all these trends converge.”


Security and patching: 5 resolutions for 2019

The number of IT assets that companies have in place continues to go up, with more endpoint devices, servers and applications in place that all need to be kept up to date. At the same time, the number of known vulnerabilities continues to rise, and the amount of time to deploy the available patches is coming down. The amount of time between vulnerabilities getting announced and exploits becoming available is dropping. This shrinking window makes it difficult to keep systems up to date when there are hundreds, thousands or even millions of assets to consider. The second issue is error proofing. Patches may break other applications, or introduce other flaws that lead to more security issues in the future. In some cases, they may not work or may break the machines they are applied to. Whatever the outcome, a poorly applied patch may cause more harm than good. Testing to check that these failure conditions don't occur is therefore necessary to avoid such problems. The third issue is prioritization. With so many assets to look after and so many applications to test, it can be hard for teams to know where to put their efforts.
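Prioritization in practice often starts with a scoring function combining severity with exploit and exposure context. The weights below are illustrative assumptions only, not a published scheme:

```java
// Toy patch-priority score: severity plus exploit and exposure context.
public class PatchPriority {
    // Higher base severity, a public exploit, and internet exposure
    // all raise the urgency of deploying the patch (weights illustrative).
    public static double score(double cvss, boolean exploitPublic, boolean internetFacing) {
        double s = cvss;                 // 0..10 base severity
        if (exploitPublic)  s += 3.0;    // exploit code already in the wild
        if (internetFacing) s += 2.0;    // reachable attack surface
        return Math.min(s, 15.0);
    }

    public static void main(String[] args) {
        // An internet-facing asset with a public exploit outranks the same
        // vulnerability on an isolated machine.
        System.out.println(score(7.5, true, true));
        System.out.println(score(7.5, false, false));
    }
}
```

Ranking assets by such a score gives teams a defensible answer to "where do we put our effort first" across thousands of patch candidates.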


UK contactless card fraud doubles


Unlike chip and PIN transactions, contactless payments can be made without additional authentication, such as a PIN. Under current rules, payments of up to £30 can be made using the technology. Contactless is overtaking chip and PIN as the most popular way of paying for goods and services because of its convenience. According to recent figures from payment processing firm Worldpay, more card payments were made using contactless technology than chip and PIN in the UK over the 12 months from June 2017 to June 2018. It revealed that, after increasing by 30% on the previous year, contactless payments were the most used card payments in shops. ... “Fraudsters will do all they can to steal your card and account details and take money from your account. If you’ve seen unusual activity on your bank statements, such as purchases you don’t remember making or cash withdrawals from places you don’t remember visiting, tell your bank immediately.”


The attack surface is growing faster than it has at any other point in the history of technology

In 2019, well known tactics such as advertising, phishing and fake apps will continue to dominate the mobile threat landscape. In 2018, we tracked and flagged countless fake apps using our apklab.io platform. Some were even found on the Google Play Store. Fake apps are the zombies in mobile security, becoming so ubiquitous that they barely even make the headlines as new fake apps pop up to take the place of the ones already flagged for removal. They will continue to persist as a trend in 2019, exacerbated by fake versions of popular app brands doing their rounds on the Google Play Store. In 2018, the return of banking Trojans was also particularly pronounced on the mobile side, growing 150 percent year-on-year, from three percent to over seven percent of all detections we see worldwide. While perhaps not a big shift in terms of the overall volume, we believe that cybercriminals are finding banking to be a more reliable way to make money than cryptomining.


Singapore Airlines data breach affects 285 accounts, exposes travel details

"We have established that this was a one-off software bug and was not the result of an external party's breach of our systems or members' accounts. The period during which the incident occurred was between 2am and 12.15pm, Singapore time, on 4 January 2019, at which point the issue was resolved," the spokesperson said.  The airline said it will contact all affected customers and has "voluntarily informed" Singapore's Personal Data Protection Commission about the data breach. ... Upon contacting SIA's customer hotline, the SIA customer was informed by the call agent that the airline was performing a system upgrade and instructed to log out of her account and log back in after 24 hours. "Such incidents are unacceptable for a company as big as Singapore Airlines. How can you do a system upgrade without proper testing?" the customer had said. "It's frustrating that we're held hostage by these companies that demand our personal details, but don't keep the data safe. When you ask for my personal data, I expect you to have the technology and systems in place to keep it secured."


Expanding the boundaries of the digital workplace


One of the central principles in establishing a perimeterless digital workplace is that the network alone does not determine which services users can access. Unlike the perimeter-based security model, the decision to grant or deny access is not tightly bound to a physical location, IP address or the use of a virtual private network (VPN). Instead, user, device and other contextual data, such as threat signals, dynamically determine the appropriate access policy, which may trigger the need for multifactor authentication, access denial or other trust elevation techniques. User and contextual trust should be appropriate to the level of risk associated with the resource being accessed. This is best illustrated with an example of a user accessing sensitive data. Sometimes, the access to sensitive data – for example, company financials – might require the user to be a full-time employee using a fully managed device.
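The decision flow described above can be sketched as a policy function over user, device and threat context; none of the signal names or thresholds below come from the article, they are assumptions to show the shape of the logic:

```java
// Sketch of a perimeterless access decision driven by context, not network location.
public class AccessPolicy {
    public enum Decision { ALLOW, STEP_UP_MFA, DENY }

    // Combine user, device and threat signals into an access decision.
    public static Decision evaluate(boolean managedDevice, boolean fullTimeEmployee,
                                    int threatLevel, boolean sensitiveResource) {
        if (threatLevel >= 8) return Decision.DENY;          // active threat signal
        if (sensitiveResource && !(managedDevice && fullTimeEmployee))
            return Decision.DENY;                            // e.g. company financials
        if (sensitiveResource || threatLevel >= 4)
            return Decision.STEP_UP_MFA;                     // trust elevation
        return Decision.ALLOW;
    }

    public static void main(String[] args) {
        System.out.println(evaluate(true, true, 0, false)); // routine access
        System.out.println(evaluate(true, true, 0, true));  // sensitive data: elevate trust
        System.out.println(evaluate(false, true, 0, true)); // unmanaged device: refused
    }
}
```

Note that an IP address or VPN flag appears nowhere in the signature: the decision is bound to who, on what device, under what threat conditions.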


Super Charge the Module Aware Service Loader in Java 11

Java’s answer for providing developers with the ability to design and implement extensible applications without modifying the original code base came in the form of services and the ServiceLoader class--introduced in Java version 6. SLF4J uses this service loading mechanism to provide the plug-in model that we described earlier. Of course, dependency injection or inversion-of-control frameworks are another way to achieve the same and more, but we will focus on the native solution for the purposes of this article. ... The default ServiceLoader “load” method searches the application classpath with the default class loader. You can use the overloaded “load” method to pass a custom class loader to implement more sophisticated searches for service providers. In order for the ServiceLoader to locate service providers, the service providers should implement the service interface--in our case the PaymentService interface.
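A minimal, self-contained sketch of the lookup pattern is below. The PaymentService interface is nested only to keep everything in one file, and since this example registers no provider in META-INF/services or module-info, the loop finds nothing and the default branch runs:

```java
import java.util.ServiceLoader;

// Sketch of the ServiceLoader lookup pattern for an extensible payment API.
public class PaymentServiceDemo {
    // The service interface; providers implement it and are declared
    // in META-INF/services (classpath) or with 'provides' (module path).
    public interface PaymentService {
        String pay(double amount);
    }

    // Look up providers; fall back to a built-in default when none is registered.
    public static String payWithFirstProvider(double amount) {
        ServiceLoader<PaymentService> loader = ServiceLoader.load(PaymentService.class);
        for (PaymentService svc : loader) {
            return svc.pay(amount);  // first provider discovered wins
        }
        return "no provider registered; paid " + amount + " via default";
    }

    public static void main(String[] args) {
        System.out.println(payWithFirstProvider(9.99));
    }
}
```

Dropping a provider jar on the classpath, with a matching provider-configuration file, changes the behavior without touching this code -- which is the whole point of the mechanism.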



Quote for the day:


"Management is about arranging and telling. Leadership is about nurturing and enhancing." -- Tom Peters


Daily Tech Digest - January 06, 2019

Internet-of-Things
IoT devices are a favorite weapon for attackers who use them to penetrate local networks and conduct other attacks. As consumers slowly learn how to protect their PCs and mobile devices, they will also need to learn how to stay safe as more of their traditional appliances go online. The security industry, too, will have to adjust to this new reality. On a related note, as it is the network that many of these devices will eventually exist on, the gradual introduction of 5G is likely to bring challenges in 2019. For example, Verizon and Samsung have already announced that they will offer 5G smartphones in the U.S. This is a key issue because the telecom industry has always had a turbulent relationship with security. For example, although operators are well aware of potential issues, 78% of telecom networks are vulnerable to attacks. SMS interception, for example, is still possible in nine cases out of 10.



How the future of work may unfold: A corporate demand-side perspective

Corporate labour demand is estimated as a function of AI diffusion at the corporate level, based on answers to a global survey covering more than 3,000 executives across 14 sectors and ten countries. The survey answers were weighted based on the relative size of companies. The rough data suggest that a decline in employment is not inevitable, with only 19% of answers suggesting that employment levels will be down (although only 10% of firms will systematically increase employment). The largest expectations of decline were in the sectors that are most advanced in their use of AI, such as media, telecom or high-tech services, but the same is true with respect to the largest share of firms expecting to grow employment with respect to AI, suggesting that the type of AI diffusion is as important as AI itself in determining the direction of labour demand.


What Countries and Companies Can Do When Trade and Cybersecurity Overlap


Since it is not feasible to thoroughly examine the software, firmware, and hardware of every single product, what should countries and companies do to prevent cyber intrusions? One seemingly obvious approach is to exclude from import potentially dangerous products from questionable countries. But this approach requires identifying which products are dangerous and which countries are questionable — a formidable task. And such restrictions can quickly become policies, with implications for international trade and the world economy. Countries and companies need to consider their options. At present, there is no framework for understanding and categorizing the cybersecurity concerns involved in trade. Without a clear understanding, governments may implement policies that result in cyber conflicts, while businesses will struggle to keep up with how cybersecurity concerns and restrictions are evolving.


Conquering the Challenges of Data Preparation for Predictive Maintenance


The first step required for PdM involves data acquisition. Industrial instrumentation is most often associated with measurements of physical quantities such as vibration, infrared heat, electrical current, metal particles in grease, etc. This data typically originates from sensors attached to programmable logic controllers within an industrial control network. Data gateways that bridge control and corporate networks facilitate access to that data via developer-friendly protocols, such as REST and MQTT. It’s also important to consider out-of-band data sources, such as operator logs or weather data, because they can also contain signals that correlate to failure events. Figure 1 below illustrates the interconnections between these types of data assets. Data ingestion is accomplished by processes that continuously collect and store data. These processes can be implemented as custom applications but are generally much easier to develop and manage using a dataflow management tool, such as StreamSets or Apache NiFi.
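Stripped to its essentials, an ingestion process is a loop that polls a gateway and appends to a store. In the sketch below, the gateway is abstracted as a Supplier standing in for a REST or MQTT client -- an assumption made so the example stays self-contained:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Skeleton of a continuous-collection ingestion process.
public class SensorIngest {
    // Poll the gateway a fixed number of times and store each reading.
    // Real ingestion runs indefinitely and records timestamps and asset IDs.
    public static List<Double> ingest(Supplier<Double> gateway, int samples) {
        List<Double> store = new ArrayList<>();
        for (int i = 0; i < samples; i++) {
            store.add(gateway.get());
        }
        return store;
    }
}
```

Dataflow tools like StreamSets or NiFi manage exactly this loop for you, adding delivery guarantees, back-pressure and monitoring that a hand-rolled version would need to reimplement.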



RIP ICOs: 2019 Will Be the Year of Enterprise Blockchain Tokens


It turns out that the first killer app of the internet was not email. It was the ridiculously simple web page. The first killer app of blockchain is the ridiculously simple token. A token is a mere smart contract that encapsulates the rules governing the exchange of an asset. Once this contract can be generated from an underlying legal contract and shown to execute in line with the legal contract, regulated, legally sound applications of blockchain become possible. This is a big deal. It turns out, all economic activity, micro or macro, is built on top of legal contracts. Unfortunately, because of information asymmetries, the cost of enforcement, the risk of disputes and uncertainty in legal systems, the cost of contracting in too many transactions can exceed the benefit of the transaction. Smart contracts that execute in line with legal contracts, provide evidence of state on-chain, and ship with dispute-resolution systems can dramatically reduce the costs of contracting and the cost of enforcement, unlocking economic activity across industries and economies.
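The claim that a token is "a mere smart contract that encapsulates the rules governing the exchange of an asset" can be made concrete with a toy ledger. This is a sketch of the idea only -- real chain tokens add cryptographic signatures, events and the dispute-resolution hooks mentioned above:

```java
import java.util.HashMap;
import java.util.Map;

// Toy token: a ledger plus the rule governing transfers of the asset.
public class AssetToken {
    private final Map<String, Long> balances = new HashMap<>();

    public AssetToken(String issuer, long supply) {
        balances.put(issuer, supply);  // the whole supply starts with the issuer
    }

    public long balanceOf(String holder) {
        return balances.getOrDefault(holder, 0L);
    }

    // The encoded rule: a transfer executes only if the sender holds the amount.
    public boolean transfer(String from, String to, long amount) {
        long have = balanceOf(from);
        if (amount <= 0 || have < amount) return false;  // contract refuses invalid exchange
        balances.put(from, have - amount);
        balances.put(to, balanceOf(to) + amount);
        return true;
    }
}
```

Everything of economic interest -- who may transfer, how much, under what conditions -- lives in the transfer rule, which is exactly the part a legal contract would otherwise specify.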


SaaS Business Models Analyzed


One thing startups and SMBs should keep in mind when working on an idea they want to implement in the form of SaaS is good user feedback and testing before product launch. I know from personal experience that companies in the SaaS space often want to put a product out there before it is really ready to go mainstream and let the market handle it. They do not have the patience to wait for critical feedback in the form of beta testing or focus groups ahead of launch. This can turn potential customers or early adopters away, and it may be hard to bring them back later once they have a distaste for the product. A free trial mitigates this in many ways and is really the most important thing, along with a good user interface, that companies should be looking at when launching a product. Trials will let users know they can opt out of the product anytime and the best ones for SaaS ...


IT Operations and Developers – Can’t We All Just Get Along?

Although old habits die hard, IT Ops and Dev need to realize they will benefit from an improved relationship. The survival of your business could very well depend on it. If your company can’t develop and innovate fast enough, your competition will overtake you. For example, ten years ago, who would have thought you could buy a mattress in a box and have it shipped to your door? Even more surprising, who imagined a service to have your teeth straightened—without expensive and time-consuming dental visits? Just take some pictures, send in a mold, and you’ll be sent a new set of aligners monthly to achieve your perfect smile. Your IT Ops team needs to understand and acknowledge the efficiency and productivity gains the Dev team needs through feature releases. Likewise, your Dev team knows they need to partner with IT Ops to ensure they have the resources they need to deliver services faster.


Enterprise Agility in the Norwegian Government

One huge enabler for business alignment has been our new application architecture. We are working hard to move away from a complex architecture, with a lot of dependencies and mainframe solutions, to self-developed, Java-based applications which all use NAV’s container platform, NAIS (NAV = The Norwegian welfare and labor administration). Microservices are responsible for functionality and data within their area. Events and data become available for other services and for analysis through data streams. These data streams create loose couplings. Our efforts with people, processes and the application architecture now make it possible for us to work in business domains, based on life events. In short, teams within the domains can work decoupled from other teams, gaining development speed and without project overhead. Each of these domains will be led by the business side, and will consist of several functional product teams, with one goal – to deliver value within their field.


Naturally occurring organizational & technological shifts make systematic risk management critical to cybersecurity
The overall point here is that these shifts are to be expected over time. However, anticipating shifts -- and building in instrumentation to know about them -- separates the best programs from the merely adequate. So how can we build this level of understanding and future-proofing into our programs? To begin with, there is no shortage of risk models and measurement approaches, systems security engineering capability models (e.g. NIST SP800-160 and ISO/IEC 21827), maturity models, and the like -- but the one thing they all have in common is establishing some mechanism to be able to measure the overall impact to the organization based on specific controls within that system. The lens you pick -- risk, efficiency/cost, capability, etc. -- is up to you, but at a minimum the approach should be able to give you information frequently enough to understand how well specific elements perform in a manner that lets you evaluate your program over time.


Stop the Presses: Don't Rush Tribune Ransomware Attribution

The appearance of Ryuk led some media outlets to rush to connect the attribution dots and suggest that North Korea had attempted to disrupt U.S. newspapers. That's because Ryuk's code shares numerous similarities with Hermes ransomware, as security firm Check Point Software noted in a report released in August. The U.S. government later incorporated that information into its own alert about Ryuk. "Our research led us to connect the nature of Ryuk's campaign and some of its inner-workings to the Hermes ransomware, a malware commonly attributed to the notorious North Korean APT Lazarus Group, which was also used in massive targeted attacks," Check Point says in its August report. But Check Point emphasized that Ryuk's reuse of Hermes code proves nothing.



Quote for the day:


"Success is not how high you have climbed, but how you make a positive difference to the world." -- Roy T. Bennett


Daily Tech Digest - January 05, 2019

"AI becomes the UI, meaning that the pull-based/request-response model of using apps and services gradually disappears," Agarwal wrote. "Smartphones are still low IQ, because for the most part you have to pick them up, launch them and ask something, and then get a response back. In better-designed apps, however, the app initiates interactions via push notifications. Let's take this a step further so that an app, bot, or a virtual personal assistant using artificial intelligence will know what to do, when, why, where and how. And just do it." ... When it comes to machine learning, Agarwal wrote, the advances we'll see in 2019 are part of a logical evolution of that technology. "The most valuable data comes with context," he said, "what you've done before, what questions you've asked, what other people are doing, what's normal versus odd activity. And the best understanding comes from the depth of data in domain-specific use cases, such as manufacturing, marketing campaigns, e-commerce sites, or IT operations centers."



Email trustworthiness: Here’s how to avoid looking like spam

At the same time SPF was being published, a second standard was in the works: DKIM (DomainKeys Identified Mail), a cryptographic solution for ensuring that content wasn't tampered with during message transport. Creating standards around where a message originates, and what's in the message when it's received versus when it was sent, greatly helps with establishing the trustworthiness of a given email and of its sender. But again, this was not a total and complete solution to the global epidemic of spam. DKIM, along with SPF, became the foundation for DMARC (Domain-based Message Authentication, Reporting and Conformance) in 2011. DMARC allows the sender of an email to create a set of instructions for the receiving domain on what to do if the message fails an SPF or DKIM check. This policy makes it very difficult to spoof brands and deliver fraudulent messages to unsuspecting recipients, or hijack pieces of content to fool filters.
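As a concrete illustration (the domain and report address below are hypothetical placeholders), a domain owner publishes its DMARC policy as a DNS TXT record under the _dmarc subdomain:

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com"
```

Here p=reject asks receiving servers to reject messages that fail the DMARC check, and rua asks them to mail aggregate failure reports to the given address, giving the sender visibility into spoofing attempts against its domain.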


While advances in recognition algorithms are important, improvements are more pressing on the sensor side, to provide higher-quality input for the algorithms to analyze. In an interview with Bloomberg last month, Sony's sensor head Satoshi Yoshihara indicated that 3D camera sensors with advanced depth sensing are coming in 2019. Sony's depth-sensing method relies on measuring the time it takes for invisible laser pulses to travel to the target and back to the handset. ... Despite these advances, legal frameworks for biometric security are still inadequate, with apparently neither the interest nor the desire among policymakers to address the problems. While legal protections exist against forcing suspects to disclose passwords or unlock devices for the convenience of law enforcement, biometric authentication can be exploited by anyone with physical hardware access. In 2018, police in Ohio unlocked an iPhone X by forcing a suspect to put their face in front of the phone.


Microsoft Releases Surface Diagnostic Toolkit for Business


The Surface Diagnostic Toolkit for Business has two "modes," a desktop mode and a command-line mode, according to Microsoft's documentation. The desktop mode, which has a graphical user interface, is used to assist end users in help-desk fashion or it can be used to create a "distributable .MSI package" for deployment on Surface devices, where the end users are the ones to carry out the tests. Using the toolkit's command-line mode, IT pros can collect details about a Surface device's system information. They also can gather Surface device health indicators via built-in Best Practice Analyzer capabilities. The toolkit will show information about any missing drivers or firmware updates. It'll also report on the warranty status of a Surface device. The tests carried out by the toolkit's Best Practice Analyzer segment will check the state of a Surface device's BitLocker encryption and Trusted Platform Module, and whether or not Secure Boot protection has been enabled on the device's processor.


HHS Publishes Guide to Cybersecurity Best Practices

The goal of the guidance is to aid healthcare entities - regardless of their current level of cyber sophistication - in bolstering their preparedness to deal with the ever-evolving cyber threat landscape. "I spend a lot of time in healthcare providers that run the gamut in size and security maturity and still the top two questions are either: 'Where do I start?' or 'What do I do next, now that this part is done,'" says former healthcare CIO David Finn, an executive vice president at security consulting firm CynergisTek. "The days of small providers not knowing what to do or large providers thinking they've done all they need to do are over," adds Finn.  HHS notes in a statement that the "Health Industry Cybersecurity Practices: Managing Threats and Protecting Patients" document is the culmination of a two-year effort involving more than 150 cybersecurity and healthcare experts from industry and the government under the Healthcare and Public Health Sector Critical Infrastructure Security and Resilience Public-Private Partnership.


Three Ways Legacy WAFs Fail

At the time, a drop-in web application security filter seemed like a good idea. Sure, it sometimes led to blocking legitimate traffic, but such is life. It provided at least some level of protection at the application layer — a place where compliance regimes were desperate for solutions. Then PCI (Payment Card Industry) regulations got involved, and the whole landscape changed. ... Most people weren’t installing WAFs due to their security value — they just wanted to pass their mandatory PCI certification. It’s fair to say that PCI singlehandedly grew the legacy WAF market from an interesting idea to the behemoth that it is today. And the legacy WAF continues to hang around, an outdated technology propped up by legalese rather than actual utility, providing a false sense of security without doing much to ensure it. If that isn’t enough for you to show your legacy WAF the door, here are three more reasons why legacy WAFs should be replaced.


Poor data-center configuration leads to severe waste problem

The EPA estimates e-waste, disposed electronics, now accounts for 2 percent of all solid waste and 70 percent of toxic waste, thanks to the use of chemicals such as lead, mercury, cadmium and beryllium, as well as hazardous chemicals such as brominated flame retardants. A lot of that is old servers and components. And much of that is due to poor configuration and management, according to a study from server vendor Supermicro. In a survey of people who purchase and administer data-center hardware (pdf), only 59 percent of the 361 respondents consider energy efficiency important when building or leasing a new data center. It's fourth on the priorities list, behind security, performance, and connectivity, when managing existing data centers. The result? About 58 percent of respondents did not know their data center's Power Usage Effectiveness (PUE). PUE is the ratio of the total energy a facility consumes to the energy consumed by its IT equipment, so it captures how much overhead, chiefly cooling, a facility carries.
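The calculation behind that metric is a single ratio; the sketch below uses made-up power figures purely for illustration:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    # PUE = total facility energy / IT equipment energy.
    # A value of 1.0 would mean every watt goes to computing;
    # real facilities run higher because of cooling and power distribution.
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1500 kW total draw, 1000 kW reaching IT gear.
print(pue(1500.0, 1000.0))  # 1.5
```

A facility at 1.5 spends half a watt on overhead for every watt of computing, which is exactly the kind of number the surveyed administrators could not report.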


Now is the time to get serious about your cloud strategy

The move away from enterprise data centers has been less aggressive than predicted. It seems that, according to enterprise IT, many applications and data sets can't live anywhere else, and while cloud computing is an option, IT views it as a tactical solution. The fact is that cloud computing is no bed of roses. Costs are typically higher than expected, migration is typically costlier and more complex than expected, and operations are much more laborious than expected. However, cloud computing keeps you out of the hardware and software procurement and operations weeds, letting you move faster. And if you're smart in its usage, cloud computing can make things much cheaper and lower-risk. Generally speaking, cloud computing makes you more agile and, most of the time, cheaper. So why are enterprises so slow to move to cloud computing and shut down their data centers?


Reviewing 2018, predicting 2019

Software as we know it has fundamentally been a set of rules, or processes, encoded as algorithms. Of course, over time its complexity has been increasing. APIs enabled modular software development and integration, meaning isolated pieces of software could be combined and/or repurposed. This increased the value of software, but at the cost of also increasing complexity, as it made tracing dependencies and interactions non-trivial. But what happens when we deploy software based on machine learning approaches is different. Rather than encoding a set of rules, we train models on datasets and release them into the wild. When situations occur that are not sufficiently represented in the training data, results can be unpredictable. Models will have to be re-trained and validated, and software engineering and operations need to evolve to deal with this new reality. Machine learning is also shaping the evolution of hardware. For a long time, hardware architecture has been more or less fixed, with CPUs as its focal point.
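The rule-versus-model contrast can be made concrete with a deliberately tiny sketch. All names and examples here are invented, and the "model" is a naive keyword learner rather than anything a production system would use:

```python
# A hand-coded rule: the behavior is fixed by the programmer.
def rule_based_spam_check(subject: str) -> bool:
    return "winner" in subject.lower()

# "Training": derive behavior from labeled examples instead of rules.
def train_keyword_model(examples):
    spam_words, ham_words = set(), set()
    for subject, is_spam in examples:
        words = set(subject.lower().split())
        (spam_words if is_spam else ham_words).update(words)
    # Keep words seen only in spam examples.
    return spam_words - ham_words

def model_based_spam_check(model, subject: str) -> bool:
    return bool(model & set(subject.lower().split()))

examples = [
    ("you are a winner", True),
    ("claim your prize now", True),
    ("meeting notes for monday", False),
    ("project status update", False),
]
model = train_keyword_model(examples)
print(model_based_spam_check(model, "prize inside"))   # True
print(model_based_spam_check(model, "status update"))  # False
# A subject unlike anything in the training data behaves unpredictably --
# here it silently passes, illustrating the article's point:
print(model_based_spam_check(model, "free cruise tickets"))  # False
```

The rule's behavior can be read off its source; the model's behavior can only be understood relative to the data it was trained on, which is why under-represented inputs require re-training and re-validation.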


Will greater clarity on regulation 'considerably expand' .. crypto market?

Self-regulation will be necessary, because global regulatory bodies move incredibly slowly, especially in such a complex space as the world of digital currencies. "In 2019, the cryptocurrency market is set to radically evolve," confirms Green. "We can expect considerable expansion of the sector largely due to inflows of institutional investors." "Major corporations, financial institutions, governments and their agencies, prestigious universities and household-name investing legends are all going to bring their institutional capital and institutional expertise to the crypto market." "The direction of travel has already been on this path, but there is a growing sense that institutional investors are preparing to move off the sidelines in 2019." This prediction is optimistic, and if anything is certain in the crypto space, it is uncertainty itself.



Quote for the day:



"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer


Daily Tech Digest - January 01, 2019

6G will achieve terabits-per-second speeds

The school has been a major research partner in millimeter-wave 5G development, alongside Nokia, and is now starting work on 6Genesis, its 6G development program. 6G is also sometimes called 5G Long Term Evolution. The University of Oulu has been promised funding for the program equivalent to US $290 million, to be supplied by the Finnish government's Academy of Finland and other sources, including partners. Collaborators in the eight-year program will include Nokia, BusinessOulu (my host, which paid some of my travel expenses to the UArctic Congress conference last week), and other universities. "Millisecond latency [found in 5G] is simply not sufficient," Pouttu said. It's "too slow." One of the problems that will be encountered in 5G overall is related to required scalability, he said. The issue is that the entire network stack is going to be run on non-traditional, software-defined radio. That method inherently introduces network slowdowns. Each orchestration, connection or process decelerates the communication.



The solution to dysfunctional cybersecurity and network teams

The answer appears to be allowing the cybersecurity team complete access to the network. "The percentage of survey participants reporting a high level of trust between teams more than doubles at organizations providing complete visibility to cybersecurity staff," the report mentions. "Similarly, when the cybersecurity team has complete visibility, organizations have a higher level of confidence that they are well equipped to protect the network from future cybersecurity attacks." Besides resolving trust issues and promoting collaboration, there are the following additional benefits: both teams have greater confidence that team members understand what's happening on the network; each team's activity will complement, rather than overlap or interfere with, the other team's efforts; and respondents (55%) believe integrating the teams will allow a faster, more efficient response to security events.


Threat of the month: Android master key vulnerability
Information pilfered includes “emails, retainer agreements, non-disclosure agreements, settlements, litigation strategies, liability analysis, defence formations, collection of expert witness testimonies, testimonies, communications with government officials in countries all over the world, voice mails, dealings with the FBI, USDOJ, DOD, and more, confidential communications, and so much more,” the group wrote, explaining that the law firm paid the initial ransom demand but then breached the terms of agreement by reporting to law enforcement. The group, which threatened to “bury” the company unless a second ransom demand was paid in bitcoin, said it would escalate the release of the law firm’s internal files, noting “each time a Layer is opened, a new wave of liability will fall upon you.” The hackers referred to Hiscox as one “of the biggest insurers on the planet,” referencing the World Trade Center, following up with a tweet promising to provide “many answers about 9.11 conspiracies through our 18.000 secret documents leak.”



Machine Learning in Excel

This article is written for those who are curious about the mathematics behind neural networks (NNs). It might also be useful if you are trying to develop your own NN. It is a cell-by-cell walk-through of a three-layer NN with two neurons in each layer. Excel is used for the implementation. ... We are both curious about machine learning and neural networks. There are several frameworks and free APIs in this area, and it might be smarter to use them than to invent something that is already there. But on the other hand, it does not hurt to know how machine learning works in depth. And we also think it is a lot more fun to dig down into things, don't we? My journey into machine learning has perhaps just started. I started by googling and reading a lot of great material on the internet. I also watched a few good YouTube videos. But I found it hard to gain enough knowledge to start coding my own AI. Finally I found an article that suited me, on which the rest of this text is based.
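The forward pass of such a network, three layers with two neurons each, fits in a few lines of code. The weights, biases and inputs below are arbitrary illustrative values, not the figures from the article's spreadsheet:

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # Each neuron computes sigmoid(dot(inputs, w) + b) -- exactly what
    # a spreadsheet cell would hold as a formula.
    return [sigmoid(sum(i * w for i, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

# Made-up parameters for a 2-2-2 network (two neurons per layer).
w1 = [[0.15, 0.20], [0.25, 0.30]]; b1 = [0.35, 0.35]
w2 = [[0.40, 0.45], [0.50, 0.55]]; b2 = [0.60, 0.60]
w3 = [[0.30, 0.35], [0.40, 0.45]]; b3 = [0.10, 0.10]

x = [0.05, 0.10]
h1 = layer_forward(x, w1, b1)    # hidden layer 1 activations
h2 = layer_forward(h1, w2, b2)   # hidden layer 2 activations
out = layer_forward(h2, w3, b3)  # output layer activations
print([round(v, 4) for v in out])
```

Each call to layer_forward corresponds to one block of spreadsheet columns in the Excel implementation, which is what makes the cell-by-cell walk-through tractable.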


An Introduction to CSS Shapes

Until the introduction of CSS Shapes, it was nearly impossible to design a magazine-esque layout with free-flowing text for the web. Instead, web design layouts have traditionally been shaped with grids, boxes, and straight lines. CSS Shapes allow us to define geometric shapes that text can flow around. These shapes can be circles, ellipses, simple or complex polygons, and even images and gradients. A few practical design applications of Shapes might be displaying circular text around a circular avatar, displaying text over the simple part of a full-width background image, and displaying text flowing around drop caps in an article. Now that CSS Shapes have gained widespread support across modern browsers, it’s worth taking a look into the flexibility and functionality they provide to see if they might make sense in your next design project. The current implementation of CSS Shapes is CSS Shapes Module Level 1, which mostly revolves around the shape-outside property. shape-outside defines a shape that text can flow around.
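A minimal sketch of the circular-avatar case mentioned above (the class name and sizes are invented for illustration) might look like this:

```css
/* Text in the surrounding paragraph flows around the circle,
   not around the element's square bounding box. */
.avatar {
  float: left;            /* shape-outside only applies to floats */
  width: 150px;
  height: 150px;
  border-radius: 50%;     /* makes the element itself render round */
  shape-outside: circle(50%);
  shape-margin: 1em;      /* keeps text from hugging the shape edge */
}
```

Note that border-radius only clips the element's visual box; it is shape-outside that changes how neighboring text wraps.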


Data Ingestion Best Practices

In the good old days, when data was small and resided in a few dozen tables at most, data ingestion could be performed manually. A human being defined a global schema and then assigned a programmer to each local data source to understand how it should be mapped into the global schema. Individual programmers wrote mapping and cleansing routines in their favorite scripting languages and then ran them accordingly. Today, data has gotten too large, both in size and variety, to be curated manually. You need to develop tools that automate the ingestion process wherever possible. For example, rather than manually defining a table’s metadata, e.g., its schema or rules about minimum and maximum valid values, a user should be able to define this information in a spreadsheet, which is then read by a tool that enforces the specified metadata.
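The spreadsheet-driven enforcement idea can be sketched in a few lines. The column names and ranges below are hypothetical, and the "spreadsheet" is represented as exported CSV text for simplicity:

```python
import csv
import io

# A user-maintained metadata spec: column name plus min/max valid values,
# as it might be exported from a spreadsheet.
spec_csv = """column,min,max
temperature,-40,60
humidity,0,100
"""

def load_spec(text: str) -> dict:
    """Parse the spec into {column: (min, max)}."""
    return {row["column"]: (float(row["min"]), float(row["max"]))
            for row in csv.DictReader(io.StringIO(text))}

def validate(record: dict, spec: dict) -> list:
    """Return names of columns whose values fall outside the declared range."""
    return [col for col, (lo, hi) in spec.items()
            if not lo <= record[col] <= hi]

spec = load_spec(spec_csv)
print(validate({"temperature": 25.0, "humidity": 45.0}, spec))  # []
print(validate({"temperature": 80.0, "humidity": 45.0}, spec))  # ['temperature']
```

The point is the division of labor: a non-programmer maintains the spec, while a generic tool, not per-source hand-written scripts, applies it to every incoming record.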


Experian exec says biometrics won’t save you from mobile hacks

"There are a number of ways every security system, not limited to biometrics, can be duped. And most of it, as we have found in post breach research, is due to some form of human error. Biometrics themselves may be very strong, just like malware protection or device security, but the hackers look for a [human] weakness. For example, biometrics may have different levels of sensitivity, and if the person setting up the biometrics doesn't turn up the sensitivity high enough, more people are easily able to get in. If you turn it up too high, you have too many people rejected. "Point I'm making is 80% to 85% of all breaches we service have a root cause in employees not doing the right thing, making a mistake, doing stupid stuff. It's not necessarily that the hackers are so smart that they have all these different attack vectors that are so much better than the company's security; they're looking for the weakest link, and generally employees are the weakest link."


Artificial Intelligence is an engineering problem, not magic!

From an engineering point of view, with no serious mathematics background, it's very encouraging to see how accessible this field can be for people like myself who deal with applied technology solutions on a daily basis. This is to be the first of a series of articles I intend to write on the subject, a brief introduction. The aim is to build up knowledge of different AI areas and give just enough background to enable you to understand how things work, and how to implement them on a practical level. If you have a reasonable grasp of the fundamentals, there is no reason why you cannot quickly get to a position where you: know how to approach different engineering problems with AI solutions; identify which category of AI will be most suitable for a given problem; and know what libraries to use, and what you need to chain together, to build out a solid professional solution. Before we get stuck in, let's draw a line in the sand regarding AI: the type of AI that we have nowadays, which does, we must admit, some wonderful (yet limited) things, is referred to as 'narrow AI'.


Cloud computing gets a second look from health execs

While it’s certainly possible to manage data from disparate sources with on-premises solutions, the services developed by cloud vendors—including the liberal use of APIs—are already available for this purpose. “This is not a nice-to-have anymore. It is very quickly becoming an institutional imperative,” says George Gardner-Serra, partner at Clarity Insights, a consulting firm specializing in data analytics. “The leading organizations are moving very quickly in that respect.” Vendors, eager to ink contracts in the healthcare sector, are working to address providers’ needs. First, the major cloud vendors—such as Amazon Web Services, Microsoft Azure, and Google Cloud—have invested heavily in developing solutions that address security and privacy issues. “They are all willing to sign business associate agreements and maintain HIPAA-compliant structures,” notes Jeff Becker, a senior analyst at Forrester Research.


Debunking Low-Code Myths to Empower App Modernization

Using a low-code platform, citizen developers can develop very simple applications that offer basic functionality. Power builders can build applications with more functionality than those offered by citizen developers. Professional developers, on the other hand, can deliver complex applications with multiple functions and automated processes. A low-code platform lets a professional developer build applications swiftly by reducing the amount of manual coding required. In short, a low-code platform enhances the capabilities of all types of developers by letting them do more than they otherwise could in app development. ... "Low-Code and No-Code terminology itself is misleading, as the distinction isn’t about whether people need to code or not. The distinction is more about the types of people using these platforms to build applications.” This sums up the required differentiation between low-code and no-code platforms.



Quote for the day:



"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg