September 25, 2016

By starting with a well-understood contract or test case to incrementally test a feature requirement, you ensure that as each small iterative unit of work completes, it meets that contract in such a way that it is releasable, deployable software. Exploratory testing tools for new feature development come into play, as do coverage tools that send data showing anomalies between releases back to the quality process. Coveralls.io is a great tool that’s easy to configure and has wonderful visualizations for the most popular languages, while Jenkins has a highly customizable dashboard. ... Technology can’t solve all problems, however, so developers and testers will need to change some of their workflows to master CT. These concepts are closely linked to the Agile and DevOps practices you are probably already using, so adapting testing in this way should not be a huge shift.


Google Allo: Don't use it, says Edward Snowden

Allo does support end-to-end encryption, which should make it difficult for anyone but the recipient and sender to view the contents of messages; however, Google was criticized by Snowden and other privacy advocates for setting it to off by default. Allo relies on the encryption protocol used by Signal, which Snowden has vouched for as a private messaging app, but in Allo it is only active when users are in Incognito Mode. "We've given users transparency and control over their data in Google Allo. And our approach is simple -- your chat history is saved for you until you choose to delete it. You can delete single messages or entire conversations in Allo," Google said in a statement to TechCrunch.


Investing in AI offers more rewards than risks

While some may argue it’s impossible to predict whether the risks of AI applications to business are greater than the rewards (or vice versa), analysts predict that by 2020, 5 percent of all economic transactions will be handled by autonomous software agents. The future of AI depends on companies willing to take the plunge and invest, no matter the challenge, to research the technology and fund its continued development. Some are even doing it by accident, like the company that paid a programmer more than half a million dollars over six years, only to learn he automated his own job. Many of the AI advancements are coming from the military. The U.S. government alone has requested $4.6 billion in drone funding for next year, as automated drones are set to replace the current manned drones used in the field.


Crowdsourcing Data Governance

This variation of context is why the right operating model setup is so important for any data governance initiative, especially the ones that are just getting started. A successful data governance initiative will bring change, and so time becomes yet another dimension for the context. I’ve seen it happen many times: organizations launch with a best-in-class operating model to drive their stewardship. They gain adoption, and the resulting change makes the original operating model obsolete, or rather stretches it to the limit. This is why I am absolutely convinced that a data governance platform that aims to be successful needs a capability for operating model configuration: your roles, responsibilities, workflows, dashboards, views, use cases, and more.


Bossie Awards 2016: The best open source application development tools

For years and years, we’ve been building applications that collect data from the users and serve it back to them. We’re finally starting to do something with that data. Along with the best open source tools for building web apps, native apps, native mobile apps, and robotics and IoT apps, this year’s Bossie winners in application development include top projects for data analysis, statistical computing, machine learning, and deep learning. After all, if our applications can be reactive, responsive, and even “ambitious,” they can also be intelligent.


Is this the age of Big OLAP?

What has dogged OLAP, though, is its scalability. Most OLAP servers run on single, albeit beefy, servers, which limits the parallelism that can be achieved and therefore imposes de facto limits on data volumes. Customers who hit these scalability ceilings may contemplate using Big Data technologies, like Hadoop and Spark, but those tend not to employ the dimensional paradigm to which OLAP users are accustomed. What to do? Well, a few vendors have decided to take Hadoop and Spark, and leverage them as platforms on which big OLAP cubes can be built and run. ... Their approach has been to let people in those enterprises work in the OLAP environments they are comfortable with and, at the same time, make use of their Hadoop clusters.


Deep Learning in a Nutshell: Reinforcement Learning

Reinforcement learning is about positive rewards and negative rewards (punishment or pain), and learning to choose the actions which yield the best cumulative reward. To find these actions, it’s useful to first think about the most valuable states in our current environment. For example, on a racetrack the finish line is the most valuable, that is, the state which is most rewarding, and the states which are on the racetrack are more valuable than states that are off-track. Once we have determined which states are valuable, we can assign “rewards” to various states. For example, negative rewards for all states where the car’s position is off-track; a positive reward for completing a lap; a positive reward when the car beats its current best lap time; and so on.
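The reward-assignment idea above can be sketched in a few lines of Python. The state names and reward values below are invented for illustration, not taken from the article:

```python
def reward(state, beat_best_lap=False):
    """Return the reward for entering a given state on the toy racetrack."""
    if state == "finish_line":
        # Completing a lap is the most rewarding outcome; beating the
        # current best lap time earns an extra bonus.
        return 10.0 + (5.0 if beat_best_lap else 0.0)
    if state == "off_track":
        # Negative reward (punishment) for leaving the track.
        return -5.0
    # Ordinary on-track states are mildly valuable.
    return 0.1

# Cumulative reward for one short trajectory of states.
total = sum(reward(s) for s in ["on_track", "on_track", "off_track", "finish_line"])
```

An agent trained against such a reward function would learn to avoid off-track states and steer toward the finish line, since those choices maximize the cumulative reward.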


NYDFS Proposed Cybersecurity Regulation for Financial Services Companies

The goal of the Proposed Regulation is to secure “Nonpublic Information” from misuse, disruption and unauthorized access, and as noted above, such information is defined broadly. It includes not only competitively sensitive information and intellectual property, but also numerous categories of information that a Covered Entity receives from or about consumers, including information considered nonpublic personal information under the GLBA Privacy Rule. ... When something goes wrong, the Covered Entity must report it to the Superintendent. Specifically, any attempt or attack “that has a reasonable likelihood of materially affecting the normal operation of the Covered Entity or that affects Nonpublic Information” must be reported to the Superintendent within 72 hours after the Covered Entity becomes aware of the event.


Big Data Processing with Apache Spark - Part 5: Spark ML Data Pipelines

Machine learning pipelines are used for the creation, tuning, and inspection of machine learning workflow programs. ML pipelines help us focus more on the big data requirements and machine learning tasks in our projects instead of spending time and effort on the infrastructure and distributed computing areas. They also help us with the exploratory stages of machine learning problems where we need to develop iterations of features and model combinations. Machine Learning (ML) workflows often involve a sequence of processing and learning stages. Machine learning data pipeline is specified as a sequence of stages where each stage is either a Transformer or an Estimator component. These stages are executed in order, and the input data is transformed as it passes through each stage in the pipeline.
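The stage-by-stage execution described above can be illustrated with a toy, plain-Python version of the Transformer/Estimator pattern. Spark's real classes live in `pyspark.ml`; the class names and data below are invented to show only the pattern, not Spark's actual API:

```python
class Tokenizer:
    """A Transformer: takes data in, returns transformed data directly."""
    def transform(self, rows):
        return [r.lower().split() for r in rows]

class VocabEstimator:
    """An Estimator: fit() learns from the data and yields a Transformer."""
    def fit(self, rows):
        vocab = sorted({tok for row in rows for tok in row})
        index = {tok: i for i, tok in enumerate(vocab)}

        class VocabModel:
            def transform(self, rows):
                return [[index[t] for t in row] for row in rows]
        return VocabModel()

def run_pipeline(stages, data):
    """Execute stages in order, fitting Estimators as they are reached."""
    for stage in stages:
        if hasattr(stage, "fit"):   # Estimator: fit first, then use the model
            stage = stage.fit(data)
        data = stage.transform(data)
    return data

result = run_pipeline([Tokenizer(), VocabEstimator()], ["Spark ML", "ML pipelines"])
```

The input flows through the tokenizer, then through a vocabulary model fitted on the tokenized output, exactly mirroring the ordered-stage execution the article describes.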


How fintech startups can disrupt the financial services industry

Successful fintech startups will embrace “co-opetition” and find ways to engage with the existing ecosystem of established players. PayPal, for example, partners with Wells Fargo for merchant acquisition. Some business lending platforms enable banks to participate as credit providers on their platforms. Conversely, some banks partner with P2P lending platforms to provide credit to those borrowers who would ... Fintech startups are flying under the regulatory radar so far. However, that may change in the near future. Regulatory tolerance for lapses on issues such as know-your-customer, compliance, and credit-related disparate impact will be low. The experience of the microfinance industry in many developing countries in the past is a good indicator of the high impact of regulation on a previously unregulated industry.



Quote for the day:


"It is better to be defeated standing for a high principle than to run by committing subterfuge." -- Grover Cleveland


September 24, 2016

Implementing DevOps starts with rethinking deeply-rooted processes

DevOps expands enterprise agile from product management and development all the way to IT operations. We’ve done enterprise agile. We now create apps in a way that focuses on the client and maximizes throughput and quality. But that hits a barrier when you get to traditional IT operations. The changes that will be taking place over next five years create the ability to take enterprise agile into IT operations. It means improved platforms, improved automation, improved collaboration across development and ops. It means a constant flow of value into production. ... Rather than throwing large releases over the wall to operations, how can we bring teams together to identify tooling, processes, and practices that can be deployed to automate provisioning and production deployment fully? This significantly improves time to market by increasing throughput and quality.


I got 99 data stores and integrating them ain't fun

The concept has been around for a while and is used by solutions like Oracle Big Data. Its biggest issues revolve around having to develop and/or rely on custom solutions for communication and data modeling, making it hard to scale beyond point-to-point integration. Could these issues be addressed? Data integration relies on mappings between a mediated schema and the schemata of the original sources, and on transforming queries to match the original sources' schemata. Mediated schemata don't have to be developed from scratch -- they can be readily reused from a pool of curated Linked Data vocabularies. Vocabularies can be mixed/matched and extended/modified to suit custom needs, balancing reuse and adaptability.
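The mediated-schema idea boils down to a query rewrite per source. A minimal sketch, with source names and field mappings invented purely for illustration:

```python
# Mappings from mediated-schema field names to each source's own schema.
mappings = {
    "crm":     {"person": "customer",       "email": "email_address"},
    "billing": {"person": "account_holder", "email": "contact_email"},
}

def rewrite(query, source):
    """Translate a query over the mediated schema into a given source's schema."""
    mapping = mappings[source]
    return {mapping[field]: value for field, value in query.items()}

# One query against the mediated schema serves every mapped source.
q = {"person": "Jane", "email": "jane@example.com"}
crm_q = rewrite(q, "crm")
```

A single query expressed against the mediated vocabulary can then be fanned out to every source, which is what makes the approach scale beyond point-to-point integration.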


The next target for phishing and fraud: ChatOps

Like many cloud platforms, chat tools allow external organizations to leverage internal APIs to extend functionality, ranging from scheduling assistants to travel booking tools to various engineering and product management systems. Overall, this extensibility represents a core strength of these systems. From a security perspective, however, they can represent data exfiltration opportunities that must be addressed. First, not every third party company is a good steward of the data they have access to; corporate policies for vendor review and acceptable use should apply to chat programs in the same way that they do for any system. As with the GSA example, relying on users to understand the technological limitations and risks around connecting technologies is not a strong strategy.


Why Fintech has made finance courses obsolete

Today, it’s a lot more complicated because we don’t know what will be the jobs of the next 10 or 20 years. So it’s a lot harder to be passive, and I think you have to be a lot more active. As a piece of advice, if I were 20, there are two things I would do. The first thing, from an education standpoint, you have to learn something, but you have to learn something you like. Just because you want to learn fintech doesn’t mean you have to code. So you really have to learn what you like on the education standpoint, and the second thing, I think it’s about the mindset. It means that to avoid being passive but be very active, for me the best quality to have today is being able to think like an entrepreneur. You don’t necessarily need to be an entrepreneur, you may work for a big company, but you have to think like an entrepreneur.


Computers could develop consciousness and may need 'human' rights, says Oxford professor

Advances in artificial intelligence could lead to computers and smartphones developing consciousness and they may need to be given ‘human’ rights, an expert has claimed. Marcus du Sautoy, who took over from Richard Dawkins as Professor for the Public Understanding of Science at Oxford University in 2008 said it was now possible to measure consciousness and, in the future, technology could be deemed to be ‘alive.’ Most scientists believe that computers are close to getting to a point where they begin to develop their own intelligence and no longer need to be programmed, an event dubbed the ‘technological singularity.’


Why CMOs need to care about security like CIOs

While some marketers will view consumer grade file and sync platforms such as Dropbox or WeTransfer as a swift business panacea, the risk that these platforms open up for data breaches with uncontrolled sharing are high. Today, the ‘data perimeter’ – the boundary that safeguards an organization’s sensitive data – has shifted considerably. This is a result of a more mobilized workforce and greater collaboration with external partners. In the past, when most workers only accessed company information from within the four walls of the business and data was saved on shared drives from PCs located in the enterprise, the perimeter was the firewall. Since the advent of cloud computing, this has changed. In today’s connected world, the data perimeter needs to reside within individual documents, in addition to within the IT infrastructure.


Three Industrial Internet of Things (IIoT) myths that need busting

Unlike consumer markets where standardisation - formal or by market dominance - is key to success, IIoT standardisation won’t be a concern for decades. Sure, there are multiple emerging standardisation initiatives in IIoT and yes, it’s not yet possible to know which will grow or be marginalised. But it doesn’t matter. Unlike consumer markets, where new standards for, say, NFC chips in smartphones can roll out and get near full market presence in the few years it takes for people to replace their phones, industries are run on equipment that is anything from years to several decades old. This equipment has been provided by tens, or hundreds, of different suppliers.


It’s time for drivers to learn new skills

Experts predict that 6% of all jobs in the US will be gone by 2021 due to automation. The former CEO of McDonald’s sees replacing the whole company’s restaurant workers as a simple question of economics. From software and legal help to sports reports and parcel delivery, there are few jobs that won’t see some sort of reduction in the workforce. But the world’s drivers could be most at risk. Despite continued fear from the public, the money men at taxi, logistics, and delivery companies will have no such fear about deploying autonomous vehicles on the road. No wages, greater fuel efficiency, no worries about shift work or rest stops. It might be cold, but in terms of business sense it’s hard logic to argue with.


How Li-Fi Will Disrupt Data Centres’ ‘Very Ugly Radio Environment’

“Most of that is using fiber optic cables and not free space transmission (which Li-Fi seems to be). The total capacity in a data centre far exceeds anything that could be done by a shared system (same is true of radio versus use of copper cables).” He continued to say that there is also a need for shared management communications within the data centre (things like DCIM where someone wants “out of band” communications with the hardware, for example). Giving an example of recently carried-out research around the use of Li-Fi in the data centre, Christy mentioned Microsoft’s innovative work, where the tech giant complemented the “wired” network with a broadcast network (in the data centre) that could be implemented either with radio or with light transmission bounced off the ceiling.


Industrial IoT is inching toward a consensus on security

Immature security is the biggest thing delaying adoption of industrial IoT, said Jesus Molina, co-chair of IIC’s security working group, in an interview. Components commonly used in enterprise IT security, like identity and root of trust, don't really exist yet in IoT, he said. There are several components to making anything in IoT trustworthy, the framework says: safety, reliability, resilience, security and privacy. These issues come up because industrial IoT connects so many components, including things like sensors and actuators at the edge of an enterprise, that didn’t exist or weren’t connected to the internet up until now. Those edge connections can open up dangerous vulnerabilities, because they’re often designed to carry some of the most sensitive information in an organization.



Quote for the day:


"He who rejects change is the architect of decay." -- Harold Wilson


September 23, 2016

Infrastructure as code: What does it mean and why does it matter?

Code forms the backbone of this approach, giving rise to the term infrastructure as code (IaC), which, in simple terms, means code that helps in provisioning systems out onto an IT platform. Today, IaC has grown to be full-function and highly flexible, and there are several variants to consider, including declarative, imperative and intelligent IaC. The declarative approach creates a required state and adapts the target infrastructure to meet those conditions, while the imperative version creates a target environment based on hard definitions set out within the script. The intelligent state, meanwhile, takes into account other pre-existing workloads within the target environment, and reports back to a system administrator about any problems it encounters.
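The contrast between the declarative and imperative variants can be sketched against a toy in-memory "infrastructure". Real IaC tools such as Terraform or Ansible are far richer; the resource names and counts here are invented:

```python
infrastructure = {"web": 1}  # current state: one web server

# Imperative: hard-coded steps that build out the target environment.
def imperative_provision(infra):
    infra["web"] = infra.get("web", 0) + 2   # "add two web servers"
    infra["db"] = 1                          # "create one database"
    return infra

# Declarative: state only the desired end state; a reconciler adapts to it.
desired = {"web": 3, "db": 1}

def declarative_reconcile(infra, desired):
    for resource, count in desired.items():
        infra[resource] = count              # converge each resource to target
    return infra

result = declarative_reconcile(dict(infrastructure), desired)
```

Both paths end in the same environment here, but the declarative version stays correct if run again or run from a different starting state, which is the property that makes required-state approaches attractive.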


Open source technology gains steam in data center, but challenges loom

When deploying or running open source technology, the lack of professional support can leave IT scrambling. Even after combing through search engine results and discussion boards, admins still might not have an answer for an urgent question. Professional support is lacking with open source tools, and although some vendors offer support services, it often comes at a cost. When a primary driver to switch to open source is the financial aspect, spending money on the necessary support can create a dilemma. Some larger companies have the resources -- both from a financial and staffing standpoint -- to support open source hardware and software in the data center, but smaller organizations often struggle to do so.


Can Armies Of Interns Close The Cybersecurity Skills Gap?

Since cybersecurity is a relatively new field, professionals in the sector tend to pick up expertise on the job. It's only more recently that universities have started seriously ramping up programs. But BullGuard finds that's been happening internationally, not just in the U.S., so it's making moves to tap into those talent pipelines pretty much as soon as they're constructed. With its new Romania-based internship program, Lipman explains, "We took computer science students with cyberexperience in their college studies, and put them into our more innovative projects over the summer. It’s been a real win-win. We get access to new blood [and] fresh thinking, the interns get valuable real-world experience, and we build a relationship with the university." Establishing this ability to "hire straight out of college" ...


CQRS for Enterprise Web Development: What's in it for Business?

The CQRS pattern is widely acclaimed by advocates of Domain Driven Design. The approach emphasizes solving business problems first during the implementation of an application. It centers on thorough elaboration of a business domain and the context within which it will function. The possibility to focus on the business first rather than on the technical issues, and to work out all the nuances pertinent to a specific domain, is achieved through the use of the Ubiquitous Language – a single language understood by the implementation team, business analysts, domain experts and other parties involved. The language helps to share the effort among all team members – business and technical – who define and agree on the use of common business objects to describe the solution’s domain model and a certain context within such a model.


The changing data protection paradigm

The amount of new data available is staggering. As the Harvard Business Review aptly put it, "More data cross the internet every second than were stored in the entire internet just 20 years ago." This data has varying degrees of value and sensitivity, and resides on a variety of systems, including endpoints, removable media, local servers, cloud servers, and cloud-based services like Box and Dropbox. This growth and spread of data has quickly exceeded the ability of most companies to keep track of it, let alone protect it. This massive influx of data, spread out among various locations, has naturally brought with it increasing security exposures, leading to an almost daily data breach crisis.


NIST launches self-assessment tool for cybersecurity

It's designed to walk organizations through the process of figuring out "how to integrate cybersecurity risk management ... into larger enterprise business practices and processes," Matthew Barrett explained to FedScoop. Barrett is the program manager for the NIST Cybersecurity Framework — a document that catalogues the five areas of cybersecurity every company needs to know: identify, protect, detect, respond and recover. ... "The self-assessment criteria are basic enough that they could apply to organizations of any size," said Barrett. But critics aren't so sure. Larry Clinton, founder and CEO of the alliance, called the excellence builder "a pretty sophisticated tool," but added that meant it was really most useful to larger enterprises.


Ideas for Filling the Cybersecurity Skills Gap

Studies over the years show the struggle in building an IT security staff. For example, a GAO survey earlier this year of federal agencies' CISOs reveals their difficulties in recruiting, hiring and retaining security personnel. Wilshusen says the problem of maintaining a sufficient security staff makes it more challenging for agencies to effectively carry out their responsibilities. In building the federal government's cybersecurity workforce, Pritzker suggests the commission consider recommending a centralized system to recruit, train and place federal cybersecurity personnel as well as creating specialized pay scales to compete with the private sector. "We need to rethink recruitment with bold ideas like debt forgiveness for graduates of certified programs, tuition-free community college in return for federal service and cybersecurity apprenticeships within civilian agencies," the Commerce secretary says.


This Is how you stop ignoring the employee voice

In case you forgot, your employees are human. They are all living, breathing, feeling beings who deserve a bit of human interaction. Take the time to meet regularly and face-to-face with your employees. This not only gives you and your team members a chance to catch up on their performance, but also allows employees to share opinions or issues they are facing. Airing those grievances face-to-face lets employees see their manager’s reaction, as well as have an immediate discussion about what can and will be done. Now you might be thinking, “But an email thread is sooo much easier!” It’s also lazier. And might be negatively affecting employee engagement.


Serverless Architectures: The Evolution of Cloud Computing

Serverless architectures are a natural extension of microservices. Similar to microservices, serverless architecture applications are broken down into specific core components. While microservices may group similar functionality into one service, serverless applications delineate functionality into finer grained components. Custom code is developed and executed as isolated, autonomous, granular functions that run in a stateless compute service. ... For a serverless architecture, the “User” service would be separated into more granular functions. In Figure 2, each API endpoint corresponds to a specific function and file. When a “create user” request is initiated by the client, the entire codebase of the “User” service does not have to run; instead only create_user.js will execute.
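The article's per-endpoint function (its create_user.js) can be sketched in an AWS-Lambda-style handler shape; the language here is Python rather than JavaScript, and the event format, handler name, and in-memory store are illustrative assumptions:

```python
import json

USERS = {}  # stand-in for a real datastore such as a managed database

def create_user(event, context=None):
    """Handle only the 'create user' endpoint; no other service code loads or runs."""
    body = json.loads(event["body"])
    USERS[body["id"]] = body
    return {"statusCode": 201, "body": json.dumps({"created": body["id"]})}

# Simulate the API gateway invoking just this one function for one request.
response = create_user({"body": json.dumps({"id": "u1", "name": "Ada"})})
```

Each endpoint getting its own tiny, stateless function is what lets the platform scale and bill the endpoints independently, instead of running the whole "User" service per request.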


Why Red Hat is misunderstood amid public cloud worries

Any customer that bets on a cloud stack and uses proprietary APIs is going to have some form of lock-in. That's why OpenStack is such a popular movement. Now let's take those nuances back to Red Hat. "What we're seeing is that large customers see value in running everywhere," said Whitehurst. "These customers want a standard operating environment and want to take Linux with them as they go cloud." Worrywarts about Red Hat will argue that a move to the public cloud means that AWS will get the Linux business. Not necessarily. "As more goes to the public cloud the more relevant we get," Whitehurst argued. "If you are moving to Amazon you have to architect it so you're not locked in. Large enterprises feel burned by being locked in."



Quote for the day:


"Not all of us can do great things. But we can do small things with great love." -- Mother Teresa


September 22, 2016

Over 6,000 vulnerabilities went unassigned by MITRE's CVE project in 2015

Why does MITRE not have assignments for vulnerabilities identified via other sources? Why haven't the CNAs shared their own disclosures with MITRE so that CVE can reflect the information, instead of leaving entries in RESERVED status, which shows nothing? Why aren’t CNAs assigning IDs to all of the vulnerabilities they disclosed, since some of the unassigned vulnerabilities are in their products? VulnDB shows 14,914 vulnerabilities disclosed in 2015. Within that set, only 8,558 vulnerabilities have CVE-IDs assigned to them. That leaves 6,356 vulnerabilities with no CVE-ID, and likely no representation in a majority of security products. ... While these numbers are bad, what's worse is that the industry has already felt the impact of an attack against a vulnerability that wasn't assigned a CVE-ID.


EastWest Institute Launches Cybersecurity Guide for Technology Buyers

“As cybersecurity vulnerabilities continue to increase, every corporation and government needs guidance to better understand the impact of their purchasing decisions on the security and integrity of their enterprises,” said Steve Nunn, CEO and President, The Open Group. “Every organization should be questioning their suppliers concerning risk management, product development, cyber and supply chain security and best practices. This Buyers Guide supports conformance with international standards and, where appropriate, process-based certification programs that help answer some of these critical questions.”


Lockdown! Harden Windows 10 for maximum security

Windows 10 also introduces Device Guard, technology that flips traditional antivirus on its head. Device Guard locks down Windows 10 devices, relying on whitelists to let only trusted applications be installed. Programs aren’t allowed to run unless they are determined safe by checking the file’s cryptographic signature, which ensures all unsigned applications and malware cannot execute. Device Guard relies on Microsoft’s own Hyper-V virtualization technology to store its whitelists in a shielded virtual machine that system administrators can’t access or tamper with. To take advantage of Device Guard, machines must run Windows 10 Enterprise or Education and support TPM, hardware CPU virtualization, and I/O virtualization. Device Guard relies on Windows hardening such as Secure Boot.
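The allow-only-what-is-trusted model described above can be illustrated with a toy whitelist check. Device Guard itself validates cryptographic signatures inside a shielded VM; this sketch, with invented binary contents, captures only the deny-by-default idea:

```python
import hashlib

# Whitelist of hashes for binaries an administrator has vetted.
TRUSTED = {hashlib.sha256(b"trusted-app-v1").hexdigest()}

def may_execute(binary: bytes) -> bool:
    """Deny by default: only binaries whose hash is whitelisted may run."""
    return hashlib.sha256(binary).hexdigest() in TRUSTED

ok = may_execute(b"trusted-app-v1")        # vetted application: allowed
blocked = may_execute(b"unsigned-malware")  # anything unlisted: refused
```

The inversion relative to traditional antivirus is visible in the default: rather than blocking known-bad files, everything is blocked unless it is known good.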


What do IT administrator skills mean now?

The role of the IT administrator will definitely need to change as data centers hybridize across multiple types of private and public clouds, stacks of infrastructure converge and hyper-converge, and systems management develops sentience. Of course, change is inevitable. But how can old-school IT administrators stay current and continue providing mastery-level value to their organizations? I'd recommend paying attention to current trends and emerging capabilities. Become an expert in how the organization can best use those trends. ... The future of IT is about creating higher-level value individually while leveraging core expertise widely -- developing the deepest insights, but sharing it as widely as needed to get an optimized return on the IT investment that businesses make.


IBM says: ‘Swift is now ready for the enterprise’

With Swift on the Cloud, enterprises will benefit from faster back-end API performance, safer and more reliable transaction and integration support, and the ability to re-purpose Swift developer skills on the client and server side. This integration delivers tangible benefits to enterprise IT. City Furniture was building an app to handle clearance furniture. They had intended to build their front-end apps in Swift, but were able to work with early versions of the tools IBM introduced today to build the back-end code in the same language. “They were able to build that in an incredibly short time, a few weeks,” he said. City Furniture is a perfect example of the kind of small, nimble development teams that will underpin the future of enterprise IT. “They had one developer and we helped them a bit. That one developer was also able to contribute to the project ...”


9 Ways To Ensure Cloud Security

Whether you’ve migrated some or all of your infrastructure to the cloud, or are still considering the move, you should be thinking about security. Too often, organizations assume a certain level of protection from a cloud service provider and don’t take steps to ensure applications and data are just as safe as those housed in the data center. The sheer range of cloud technology has generated an array of new security challenges. From reconciling security policies across hybrid environments to keeping a wary eye on cloud co-tenants, there is no shortage of concerns. An increasingly complex attack landscape only complicates matters and requires security systems that are vigilant and able to adapt. Here are nine tips to consider before, during, and after a cloud migration to stay ahead of the curve when evaluating security solutions for your cloud service.


Cyber Security Threat Detection – The Case for Automation

The good news is that advances in threat detection technology have significantly improved the enterprise’s ability to detect and stop these threats and prevent extensive damage. The challenge, however, is that many of these technologies demand an army of human security analysts to interpret threat indicators and determine the appropriate course of action, including elimination and clean up. With hundreds, if not thousands, of varying levels of threat flags per day, this task is like holding back the tide; it is nearly impossible for security teams to keep up with the flow of information and still perform other ongoing responsibilities in prevention and analysis. Not surprisingly given their frequency, many of these alerts are often ignored.


Taking Risks To Manage Risk: The Life Of The Modern IT Security Executive

Risk isn’t something that many IT security professionals are comfortable with. After all, they’re often employed to reduce the risk of attacks on corporate IT. ... Doing things differently often comes with the risk of failure, which can have negative consequences to a company’s IT security. But the IT security space is dynamic; new technologies, solutions and strategies come out regularly and CISOs need to keep pace with these developments. “The biggest risk at the moment is doing nothing — you’re at risk of becoming irrelevant,” CSIRO CISO and lead architect Angus Vickery said at SINET61. “You have to do something to ensure you’re continually relevant because the horse will bolt without you anyway. “… Modern CISOs need to have an open mind.”


Security framework released for industrial Internet of Things

The security framework goes along with the reference architecture, connectivity and other guides previously published by the consortium. This document separates security evaluation into endpoint, communications, monitoring and configuration building blocks, each with implementation best practices. It also breaks the industrial space down into three roles: component builders (who build hardware and software), system builders (better known to readers here as system integrators) and operational users. To ensure end-to-end security, the consortium notes industrial users must assess the level of trustworthiness of a complete system. As for the future, the concluding note in the framework points out that as the sheer volume of data required for managing devices increases, there’s a point where centralized security management ceases to be effective and efficient.


Five Strategies For Creating a Culture of Data Security

When data protection is prioritized and done well, it provides more disciplined operations, increased customer and stakeholder trust, and minimized risk. One of the best ways to protect company information is to create a corporate culture that views information security as a shared responsibility among all employees. This can be done by implementing regular and comprehensive training programs for all employees on the right way to manage, store and destroy physical and digital data. ... Experts suggest that employees may forget 50 percent of training information within one hour of a presentation, 70 percent within 24 hours and an average of 90 percent within a week. When you consider this, it is clear that training once a year or on an ad-hoc basis is insufficient to ensure valuable customer, employee and business data is being protected.




Quote for the day:


"Relative to all the other risks companies face, the cyber risks often aren't as big a deal as we think. It may be bad for you if you are the victim, but it doesn't change the behavior or strategy of a company." -- Sasha Romanosky


September 21, 2016

Five Social Engineering Scams Employees Still Fall For

“Most people are not going to look really closely to know where that email came from, and they click on it and their machine may be taken over by somebody, or infected,” says Ronald Nutter, online security expert and author of The Hackers Are Coming, How to Safely Surf the Internet. “Especially when you’re exchanging files with subcontractors or partners on a project, you really should be using a secure file transfer system so you know where the file came from and that it’s been vetted.” He also cautions recipients to be wary of any file that asks the user to enable macros, which can lead to a system takeover.


How flexible should your infosec model be?

How often to adopt infosec policy changes is a conundrum. Companies need to come up with a way to remain flexible, to ensure that their policies and procedures reflect the current threat landscape, yet they can't hand down so many new rules and restrictions that they frustrate users and inadvertently compel them to consider bypassing corporate rules, explains Kelley Mak, an analyst at Forrester Research. At the same time, companies have to strike a balance between using firefighting tactics to address the most current threats and treating information security policy as a holistic strategy, Mak says. "It's not as simple as taking the data and making a new policy, because you have to make sure information workers aren't upset," he says. "The more restrictions you put in place, the more likely someone is to go around it."


Cybercrime Inc: How hacking gangs are modeling themselves on big business

Like the legitimate software market, cybercrime is now a huge economy in its own right, with people with a range of skillsets working together towards one goal: making money with illicit hacking schemes, malware, ransomware, and more. It's essentially an extension of 'real world' crime into cyberspace, and it's come a long way in recent years as groups have become bigger, more specialized, and more professional. "There's been a substantial amount of improvement and innovation in the way attackers go after networks and, as cybercrime has professionalized, you've seen individuals develop a particular set of skills which fit into a broader network," says Gleicher, now head of cybersecurity strategy at Illumio.


Picking up the pace: The intersection of strategy and agility

Organizational agility, not to be confused with the Agile methodology, is the ability to quickly identify and execute initiatives for opportunities and risks that align with overall strategy. This means that organizations must not only stay aware of changes in their business environments, but also be flexible enough to change direction and implement new initiatives quickly, both to avoid risks and to achieve competitive advantage. APQC and Strategic and Competitive Intelligence Professionals (SCIP) conducted a survey to look at organizational agility and understand what role strategy plays in helping organizations be more agile. To that end, the survey investigated organizations’ agility, strategic planning, information assessment, and implementation practices.


Roundtable: What Experts Are Doing to Protect Against Ransomware

What’s different is that your user population needs to know what to do if a ransom message appears on their screen. Do they power off, disconnect from the network or do both? Your user community has to know exactly what to do. By the way, the right answer is to disconnect from the network and not power off—rely instead on whatever mechanism you have to trigger an incident response. Do not power off. So the users have to know that. Assuming that you have the basic hygiene—the incident response plans, the remediation, the patching, the hardening, the configurations—in place, then the only other additional consideration is that if you don’t have a fast, automatic way of detecting and responding to zero-day malware—either at the network level or at the endpoint level—you need to get one.
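The playbook above — detect fast, isolate from the network, never power off, hand off to incident response — can be sketched as a toy endpoint check. The ransom-note filename patterns and the action names are hypothetical illustrations, not any real vendor’s interface.

```python
# Toy sketch of the playbook above: spot likely ransom notes, then isolate
# (never power off) and trigger incident response. The filename patterns and
# action names are hypothetical examples, not a real product's API.
import fnmatch

# Patterns loosely resembling common ransom-note filenames (illustrative list).
RANSOM_NOTE_PATTERNS = ["*DECRYPT*", "*RECOVER*FILES*", "*README*RANSOM*"]

def looks_like_ransom_note(filename: str) -> bool:
    name = filename.upper()
    return any(fnmatch.fnmatch(name, p) for p in RANSOM_NOTE_PATTERNS)

def respond(filenames):
    """Return the actions an agent would take, mirroring the advice above."""
    if any(looks_like_ransom_note(f) for f in filenames):
        # Disconnect from the network, do NOT power off, and escalate.
        return ["disconnect_network", "keep_powered_on", "trigger_incident_response"]
    return []

print(respond(["budget.xlsx", "HOW_TO_DECRYPT.txt"]))
```

The point of the sketch is the ordering of actions: isolation comes first, and powering off is deliberately absent, since it can destroy in-memory evidence and interrupt any chance of recovery.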


The Internet of Things, cyber-security and the role of the CIO

Basically we are inexperienced in creating large platforms with security in mind. This inexperience in deploying mass networks in a secure way could create a recipe for major breaches and security issues. The IoT is a relatively greenfield area in IT. It should offer the chance to design and architect solutions with security integrated right from the start, rather than as an additional feature further down the road. Whilst CIOs need to be mindful of this issue for future planning, there is also the opportunity to make sure vendors are building this security into any IT expenditure that the organisation plans to make. Existing security controls may well be able to address these new concerns, but they need to be implemented in an agile and effective way so they can adapt to the new attack vectors.


Navigating The Muddy Waters Of Enterprise Infosec

Many companies today hope to avoid similar high-profile wakeup calls. After years of news about disastrous breaches, information security has finally gotten the attention of upper management. Two-thirds of 287 U.S. respondents to a survey conducted by CSO, CIO and Computerworld said that senior business executives at their organizations are focusing more attention on infosec than they were in the past. And most of the respondents said they expect that focus to continue. Yet IT leaders still face challenges when it comes to aligning security goals with the needs of business, including justifying costs, defining risks, and clarifying roles and responsibilities.


Artificial intelligence, APIs and the transformation of computer science

As with yesterday’s code libraries, you could try to build A.I. platforms yourself -- if you had a few years and a dozen data scientists to throw at the problem. Or you can access A.I. engines like IBM’s Watson or Google’s TensorFlow “as-a-service,” taking advantage of the planet’s most advanced, fundamental CS work via an API call. When one looks at the world of software in this way, the choice for most companies today is straightforward: spend years of effort and millions of dollars duplicating extremely important -- but ultimately commodity, especially once it’s open-sourced -- computer science work, or instead focus on leveraging that work to develop and improve their own products and intellectual property. For most businesses, the choice is simple.
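Mechanically, the “A.I. as an API call” idea reduces to sending a small JSON request and reading a JSON response. The endpoint URL, field names, and response shape below are hypothetical placeholders — real services such as Watson define their own schemas — but the shape of the integration is the same.

```python
# Sketch of consuming a hosted A.I. engine via an API call. The endpoint and
# JSON schema are hypothetical placeholders; each real service defines its
# own. No network call is made here -- we only build and parse the payloads.
import json

API_URL = "https://api.example.com/v1/classify"  # hypothetical endpoint

def build_request(text: str, model: str = "sentiment-v1") -> str:
    """Serialize a classification request the way a hosted engine might expect."""
    return json.dumps({"model": model, "input": text})

def parse_response(body: str) -> str:
    """Pull the top predicted label out of a hypothetical JSON response."""
    data = json.loads(body)
    return data["predictions"][0]["label"]

req = build_request("The product launch went brilliantly")
# A response body the hypothetical service might return:
fake_body = json.dumps({"predictions": [{"label": "positive", "score": 0.97}]})
print(parse_response(fake_body))  # -> positive
```

The commodity argument in the excerpt is visible here: the hard computer science lives behind the URL, and the consuming application is a few lines of serialization.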


In a world of free operating systems, can Windows 10 survive?

We can all pretty much agree that Windows has some staying power. That said, when I asked our resident Windows soothsayer Ed Bott about actual numbers of users, he told me, "Given that PC sales are flat or down in recent years and are probably close to the replacement rate, it's likely that the very large Windows installed base is shrinking slowly." The operative word here isn't "shrinking," it's "slowly." There are millions of users out there who have good reason to stick with Windows. Many of them will continue using it because the learning curve for a different operating system is either too much work, or just simply unnecessary. Others will stay with it because Chromebooks, tablets, and other "appliance-like" machines just don't have enough power and flexibility.


How HR and IT departments can join forces to bolster security strategies

Working with IT, HR should establish processes to manage access rights to sensitive data – ensuring that appropriate controls are in place – and preventing employees from accessing data that they don’t need. HR can also support IT in identifying gaps in terms of departments or individuals, like contractors or temporary staff, with permissions that have not been withdrawn or privileges that may need to be re-defined. They can implement processes and technology for managing access rights and ensure that these are regularly audited to close any security gaps. Full co-operation between HR and IT is essential in projects of strategic importance such as IAM (Identity Access Management) deployments. Lack of such co-operation is a common pitfall: without it there can be misunderstandings or, at worst, projects can unravel entirely.
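The audit loop described above — comparing who has access against who should — can be sketched in a few lines. The data shapes are invented for illustration; a real IAM deployment would pull the roster from HR systems and the grants from directory services.

```python
# Sketch of the HR/IT access audit described above: flag access grants held by
# people no longer on the active roster (e.g. departed contractors).
# The data shapes are invented for illustration.

def stale_grants(active_staff: set, access_grants: dict) -> dict:
    """Map each resource to grant-holders who are no longer active staff."""
    stale = {}
    for resource, holders in access_grants.items():
        departed = sorted(set(holders) - active_staff)
        if departed:
            stale[resource] = departed
    return stale

roster = {"alice", "bob"}
grants = {
    "payroll_db": ["alice", "carol"],  # carol left last month
    "source_repo": ["alice", "bob"],
    "hr_files": ["dave"],              # temp contract ended
}
print(stale_grants(roster, grants))
# -> {'payroll_db': ['carol'], 'hr_files': ['dave']}
```

Running a check like this on a schedule is one concrete form of the “regularly audited” process the article calls for, and it only works if HR keeps the roster current — which is exactly the co-operation point being made.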



Quote for the day:


"Negativity will derail you from pursuing success, and like attracts like." -- Kathleen Elkins