June 07, 2015

Video: Parallel Algorithms Reconsidered
In this video, Peter Sanders from Karlsruhe Institute of Technology presents “Parallel Algorithms Reconsidered”: “Parallel algorithms have been a subject of intensive algorithmic research in the 1980s. This research almost died out in the mid-1990s. In this paper we argue that it is high time to reconsider this subject since a lot of things have changed. First and foremost, parallel processing has moved from a niche application to something mandatory for any performance-critical computer application. We will also point out that even very fundamental results can still be obtained. We give examples and also formulate some open problems.”


Privacy Risk Management for Federal Information Systems
This publication introduces a privacy risk management framework (PRMF) for anticipating and addressing privacy risk that results from the processing of personal information in federal information technology systems. In particular, this publication focuses on the development of two key pillars to support application of the PRMF: privacy engineering objectives and a privacy risk model. In so doing, it lays the foundation for the establishment of a common vocabulary to facilitate better understanding of, and communication about, privacy risks and the effective implementation of privacy principles in federal information systems. The set of privacy engineering objectives defined in this document provides a conceptual framework for engineers and system designers to bridge the gap between high-level principles and implementation.


Interview: Mike Lamble, CEO at Clarity Solution Group
“Better, cheaper, faster” is good. Schema-less writes, fitness for all data types, commodity hardware and open source software, limitless scalability – these are also good. That said, out-of-the-box Hadoop-based Data Lakes are not industrial strength. It’s not as simple as downloading the Hadoop software, installing it on a bunch of servers, loading the Data Lake, unplugging the enterprise Data Warehouse (EDW) and — voila. The reality is that the Data Lake architecture paradigm – which is a framework for an object-based storage repository that holds data in its native format until needed – oversimplifies the complexity of enabling actionable and sustainable enterprise Hadoop. An effective Hadoop implementation requires a balanced approach that addresses the same considerations with which conventional analytics programs have grappled for years: establishing security and governance, controlling costs and supporting numerous use cases.


Google Create Kubernetes-based VM/Docker Image Building Framework
The Google Cloud Platform team have released a technical solution paper and open source reference implementation that describes in detail how to automate image builds via Google Compute Engine (GCE) using open source technology such as Jenkins, Packer, and Kubernetes. The reference implementation can be used as a template to continuously build images for GCE or Docker-based applications. Images are built in a central project, and then may be shared with other projects within an organisation. The Google Cloud Platform blog proposes that ultimately this automated image build process can be integrated as a step in an organisation's continuous integration (CI) pipeline.
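The workflow described above is essentially a Packer image build driven from a CI job. The following is a minimal, hypothetical sketch of that single step, not Google's reference implementation: a Python script writes a Packer template for the googlecompute builder and invokes `packer build`. The project ID, zone, base image and provisioning commands are placeholders, and it assumes Packer and GCE credentials are already configured on the build agent (for example, a Jenkins worker).

```python
# Hypothetical sketch of one CI step: build a GCE image with Packer.
# Not Google's reference implementation; names below are placeholders.
import json
import subprocess
import tempfile

template = {
    "builders": [{
        "type": "googlecompute",                      # Packer's GCE builder
        "project_id": "my-gcp-project",               # placeholder project
        "source_image": "debian-7-wheezy-v20150603",  # placeholder base image
        "zone": "us-central1-a",
        "ssh_username": "packer",
        "image_name": "app-image-{{timestamp}}",      # unique name per build
    }],
    "provisioners": [{
        "type": "shell",
        "inline": ["sudo apt-get update", "sudo apt-get install -y nginx"],
    }],
}

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(template, f)
    template_path = f.name

# A Jenkins job would run this on every commit or on a schedule, then share
# the resulting image with other projects in the organisation.
subprocess.check_call(["packer", "build", template_path])
```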


Why “Agile” and especially Scrum are terrible
Under Agile, technical debt piles up and is not addressed because the business people calling the shots will not see a problem until it’s far too late or, at least, too expensive to fix it. Moreover, individual engineers are rewarded or punished solely based on the completion, or not, of the current two-week “sprint”, meaning that no one looks out five “sprints” ahead. Agile is just one mindless, near-sighted “sprint” after another: no progress, no improvement, just ticket after ticket. ... “Agile” and Scrum glorify emergency. That’s the first problem with them. They’re a reinvention of what the video game industry calls “crunch time”. It’s not sustainable. ... People will tolerate those changes if there’s a clear point ahead when they’ll get their autonomy back.


Big Data and the Future of Business
The point of Big Data is that we can do novel things. One of the most promising ways the data is being put to use is in an area called “machine learning.” It is a branch of artificial intelligence, which is a branch of computer science—but with a healthy dose of math. The idea, simply, is to throw a lot of data at a computer and have it identify patterns that humans wouldn’t see, or make decisions based on probabilities at a scale that humans can do well but machines couldn’t until now, or perhaps someday at a scale that humans can never attain. It’s basically a way of getting a computer to do things not by explicitly teaching it what to do, but by having the machine figure things out for itself based on massive quantities of information.
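To make the "figure things out for itself" idea concrete, here is a minimal, hypothetical sketch using scikit-learn (not from the article): instead of hand-coding rules, we give the model labelled examples and let it infer the patterns. The dataset and model choice are illustrative assumptions only.

```python
# Minimal sketch of machine learning: the model learns patterns from
# examples rather than from explicitly programmed rules.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)                       # labelled examples
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)                              # the machine infers the patterns

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```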


Datameer adds governance tools for Hadoop analytics
Data silos are one potential consequence, as are regulatory-compliance risks when sensitive data sets are being used. Datameer’s new governance module is designed to give businesses transparency into their data pipelines while providing IT with tools to audit diligently for compliance with internal and external regulations. New data-profiling tools, for example, let companies find and transparently fix issues like dirty, inconsistent or invalid data at any stage in a complex analytics pipeline. Datameer’s capabilities include data profiling, data statistics monitoring, metadata management and impact analysis. Datameer also supports secure data views and multi-stage analytics pipelines, and it provides LDAP/Active Directory integration, role-based access control, permissions and sharing, integration with Apache Sentry 1.4, and column and row anonymization functions.


How UPS uses analytics to drive down costs
Putting it in perspective, the advanced math around determining an order of delivery is incredible. If you had a 120-stop route and you plotted out how many different ways there are to deliver that 120-stop route, it would be a 199-digit number. It’s so large that mathematicians call it a finite number that is unimaginably large. It’s in essence infinite. So our mathematicians had to come up with a method for determining an order of delivery that takes into account UPS business rules, maps, what time we need to be at certain places and customer preferences. It had to be an order of delivery that a driver could actually follow to not only meet all the business needs, but do so with fewer miles than they’re driving today. And this is on top of the 85 million miles we’ve already reduced.
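The figure quoted above checks out: the number of possible orderings of a 120-stop route is 120! (120 factorial), which is indeed a 199-digit number. A quick verification in Python:

```python
# Verify the "199-digit number" claim: orderings of 120 stops = 120!
import math

orderings = math.factorial(120)
print(orderings)            # roughly 6.69 x 10^198
print(len(str(orderings)))  # 199 digits
```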


CTO interview: Customer data analytics driving revenue growth at British Medical Journal
The analytics plans have involved investing in a number of tools. Among others, these include Google Analytics and AppDynamics, which is used to monitor user behaviour, as well as a back-office monitoring tool, Cooper said. “We are using that a lot for performance and to be able to look at not just what an application is doing, but how we are using that to see what people are doing in the application,” she said. ... “Right now we are not in such a mess, but what we have got is so fragmented and we are just trying to work out what it is we need to track, what is the important data, what do we need to measure, because we have a lot of very industry-specific data models that come with being an academic publisher.”


Safe Big Data
Data privacy has historically concentrated on protecting the systems that manage data rather than the data itself. Since these systems have proven to be vulnerable, a new approach that encapsulates data in cloud-based environments is necessary. New algorithms must also be created to provide better key management and secure key exchanges. Data management concerns itself with secure data storage, secure transaction logs, granular audits and data provenance. This aspect must be concerned with validating and determining the trustworthiness of data. Fine-grained access controls along with end-to-end data protection can be used to verify data and make data management more secure.



Quote for the day:

"When you have exhausted all possibilities, remember this: You haven't." -- Thomas Edison

June 05, 2015

Co-operation driving progress in fighting cyber crime, say law enforcers
FBI assistant legal attaché Michael Driscoll said information security professionals in the private sector often see the evidence of cyber-enabled crime far quicker than law enforcement. He said it is important to engage with information security professionals as law enforcement becomes increasingly reliant on what they do on a daily basis for gathering the evidence they need. Driscoll said private organisations can help broaden law enforcement’s view and understanding of cyber-enabled crime. “Around 22,000 reports are made to the FBI’s Internet Crime Complaint Center each month, but we think that is about 10% of what actually goes on. The volume is unbelievable,” he said.


FBI official: Companies should help us ‘prevent encryption above all else’
"Privacy, above all other things, including safety and freedom from terrorism, is not where we want to go," Steinbach said. He also disputed the "back door" term used by experts to describe such built-in access points. "We're not looking at going through a back door or being nefarious," he argued, saying that the agency wants to be able to access content after going through a judicial process. But many technical experts believe that building intentional vulnerabilities into the systems that people around the world rely on reduces the overall security of the entire digital system, even if done to comply with legal requirements.


The Innerworkings of a Security Operations Center
The SOC does not just consume data from its constituency; it also folds in information from a variety of external sources that provide insight into threats, vulnerabilities, and adversary TTPs. This information is called cyber intelligence (intel), and it includes cyber news feeds, signature updates, incident reports, threat briefs, and vulnerability alerts. As the defender, the SOC is in a constant arms race to maintain parity with the changing environment and threat landscape. Continually feeding cyber intel into SOC monitoring tools is key to keeping up with the threat. In a given week, the SOC likely will process dozens of pieces of cyber intel that can drive anything from IDS signature updates to emergency patch pushes.


Project Seeks to Combine Sustainable Fish Farm and Data Center
This is a rare example of a project that attempts to combine a data center with a completely unrelated facility in a way that is mutually beneficial. Because a data center is a massive power and water consumer and a huge source of excess heat, people are often compelled to look for creative ways to utilize those aspects of mission critical facilities. Another example is a project in California’s drought-stricken Monterey County, where a group of entrepreneurs wants to combine a data center with a water desalination plant. The first initiative is the aquaculture facility, a fish farm that will produce 500,000 pounds a year of Mediterranean sea bass. A tech incubator is also planned for the site.


Uber CEO admits company is not perfect
Uber, Kalanick said, provides not just a cheaper, more efficient form of transportation that bests owning a car, regular taxis, or even public transit. The company’s technology can also improve cities by getting more cars off the road and reducing pollution, he said. Uber’s service, which lets people hail a ride from their smartphones, is now active in more than 310 cities and nearly 60 countries around the world. In some countries, like Germany and India, Uber has wrestled with regulators over its legality. Kalanick also used the event to make a plea to mayors across the U.S., asking them not to deprive people of the right to drive for Uber because of “some outdated regulation.” In the years ahead, Uber will continue to make changes to its service, particularly around the company’s low-cost UberX option, so that using Uber is cheaper than owning a car, Kalanick said.


Put microservices, cloud at heart of your IoT strategy
Users still have to collect IoT data, but also index and store it for easy access. Additionally, this model requires organizations to address IoT security at the cloud level, rather than the network level. Cloud assets growing underneath applications without direct application involvement -- as IoT assets do, since sensors are not part of user applications -- also require special planning to address data currency and to support synchronized analysis of multiple IoT sources. While current practices can likely address this, IoT application scale may prove challenging. A database and microservice IoT approach also offers better support for privacy and public policy limits. Because query patterns are directly visible, IoT systems based on microservices and queries make it easier to detect attempts to track a person's location.
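As a rough illustration of the "microservices and queries" pattern described above, the sketch below puts sensor readings behind a small query service and logs every query, so access patterns (such as repeated attempts to track one device) become visible at the service layer. This is a hypothetical example using Flask; the endpoint, fields and data are invented for illustration.

```python
# Hypothetical sketch: IoT readings behind a query microservice whose
# query log makes access patterns auditable. Names/fields are illustrative.
import logging
from flask import Flask, jsonify, request

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)

# Stand-in for an indexed IoT datastore.
READINGS = [
    {"device_id": "sensor-17", "ts": "2015-06-07T12:00:00Z", "temp_c": 21.4},
    {"device_id": "sensor-42", "ts": "2015-06-07T12:00:05Z", "temp_c": 19.8},
]

@app.route("/readings")
def readings():
    device_id = request.args.get("device_id")
    # The query itself is an auditable event -- this is what makes
    # suspicious access patterns visible at the service layer.
    logging.info("query: device_id=%s from=%s", device_id, request.remote_addr)
    rows = [r for r in READINGS if device_id is None or r["device_id"] == device_id]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=8080)
```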


"Arrogant" datacentre operators blasted by users for poor customer service approach
“If we’d asked that same question three years ago, the answer would have been cost or location, but the reason for that is because no matter what the datacentre service is – whether it be co-lo, hosting, cloud or managed service – people’s understanding of the market is so much greater now and their expectations are higher.” Because of the contractual and technology complexities involved in moving to a new datacentre supplier, users have traditionally felt inclined to make do with the service they receive, but that’s not necessarily the case anymore. “It’s difficult and disruptive to move, because moving a sizeable IT estate is complex and businesses can’t take the downtime, and it’s very expensive,” said Rabbetts.


Sharing Data, but Not Happily
Companies that are more transparent about why they collect certain customer details and how they use them may find it easier to maintain customer trust. Certainly, millions of people have signed up for store loyalty cards and frequent-flier programs that offer deals or upgrades based on consumers’ purchases. And for the many people who relish personalized services, the idea that Amazon, Facebook, Google Maps or Pandora may remember and learn from their preferences represents an advantage, not a problem. “People are always willing to trade privacy and information when they see the direct value of sharing that information,” said Mike Zaneis, the chief counsel for the Interactive Advertising Bureau, an industry group in Washington.


Flocker Tutorial: Migrating a Stateful Dockerized ElasticSearch-Logstash-Kibana Stack
Flocker is an open-source data volume manager designed exclusively for the purpose of managing stateful services running in containers. As of this writing, Flocker is at release 0.4, meaning that the project is under active development. In the coming weeks and months, we will be adding an API so that you can do everything described in this tutorial programmatically. Additionally, we are building Flocker from the ground up to work with other popular container management tools, such as Docker Swarm, Google Kubernetes and Apache Mesos. We’ve recently published a tutorial on how to use Flocker with Docker Swarm and Kubernetes, so if you are a user of either of those tools, I’d encourage you to try out our integration demos.


How to hire for personality and train for skills
"Of course you need people who know the fundamentals of their job, but when your people come across problems, it's important that they see them as just obstacles and roadblocks on the way to overall success; conceptual thinking and abstraction is at the core of this," Jersin says. As important as it is for talent to focus on their own contributions to your products and services, it's also critical that they can see how their part fits into the larger whole. "You want people who can hit their own personal targets, but also keep the big picture -- the company's overall success, development and growth -- in mind as well," says Labourey.



Quote for the day:

“Instead of focusing on how much you can accomplish, focus on how much you can absolutely love what you’re doing.” -- Leo Babauta

June 03, 2015

12 Quick Tips about Application Level Performance Testing and More
In an economy where apps have become the very heart and soul of almost any business, you have less than one second to impress your user. Because of this limited window, application performance is essential to ensuring the quality of your customers’ digital experience and their loyalty. Application Performance Management tools and methods are indispensable in ensuring application performance in real, live environments. Testing application performance should be initiated as early as possible in the application development lifecycle to avoid poor performance and ensure customer retention.


16 cool things to try with the new Google Photos
Scroll down below the faces in that same search screen, and you'll find a list of locations in which your photos have been taken. What's particularly remarkable about this is that it works even if you don't have location reporting activated, as is the case for me. How? Google says its technology is able to recognize known geographical landmarks from photos and then use logic (and the laws of physics) to infer your location in other nearby photos. If you took a snapshot of the Eiffel Tower on February 9th at 2 p.m., for instance, Google can safely assume you were still in Paris in that selfie you took in front of a bakery 45 minutes later. The accuracy and level of detail may surprise you.


When stolen data turns up on the dark web, this tech can find it fast
"There will always be a path out of your network through an advanced or insider threat," said co-founder Danny Rogers in a phone call last week. "There is no defense that's perfect. If you can't stop everything, what else can you do? That's when we started to focus on immediate threat detection," he said. Rarely do like red flags appear on a screen inside a company's firewall warning that its systems have been breached. In reality, most data breaches are discovered because someone stumbles across stolen data in an underground forum, up for sale to the highest bidder. Rogers, and his co-founder Michael Moore, said that using large-scale cloud-based automation to search for this data can considerably cut down on how long it takes to discover breaches.


CIOs future-proof the data center with hybrid strategies
"Over the next three to five years, I'd say 80% of our services will be in the cloud. But there will always be a need for services on campus," he said. For example, he plans to keep the door-locking and fire alarm systems in his own data center, where he isn't reliant on a connection to a cloud provider for them to work. His closed-circuit security system, with its high demand on bandwidth, will stay in his data center, too. Hybrid strategies like Haugabrook's also typically require staff training and reassignment of IT roles. Haugabrook said he plans to transition his staff to roles focusing on automating and integrating systems.


Preparing Data for the Self-Service Analytics Experience
Frequently, all users have to work with are spreadsheets and limited reports based on disparate, application-specific databases. Self-service BI and data discovery tools can deliver much better visualization and data exploration, but the sources often remain limited to spreadsheets and siloed application-specific databases. At larger firms, even if there is an enterprise BI standard, users grow tired of waiting: waiting for IT to find development time to address requests for BI reports and dashboards and waiting for IT to find systems time to run the reports and queries. Of course, once this is all set up, users frequently decide that they want different data or different queries and visualizations and the process must start over.


Private Cloud: A Secure Alternative to Public Clouds?
Private clouds do have challenges, especially if on-premises IT is responsible for managing it, which requires the same staffing, management, maintenance and capital expenses as a traditional data center. However, a common misconception is that private clouds always run on client premises, in the client’s own data center. In reality, there are many providers that deploy, host and manage private cloud infrastructure and solutions. A business might also choose a mix of private and public cloud services, called “hybrid” cloud deployment. In fact, Gartner predicts that the majority of private cloud deployments will eventually become hybrid clouds, meaning they will leverage public cloud resources.


Apple Watch Fails To Ignite Wearables Market, Yet
Fitbit ranked as the number one wearable maker by volume. It shipped 3.9 million devices, giving it 34.2% of the market. Xiaomi followed with 2.8 million devices and 26.4% of the market. Garmin rounded out the top three with 700,000 devices and 6.1% of the market. All three companies make low-cost devices (~$100) meant to help track health and fitness. "Bucking the post-holiday decline normally associated with the first quarter is a strong sign for the wearables market," Ramon Llamas, an IDC research manager for wearables, wrote in the June 2 report. "It demonstrates growing end-user interest and the vendors' ability to deliver a diversity of devices and experiences. In addition, demand from emerging markets is on the rise and vendors are eager to meet these new opportunities."


Big Data, Bigger Responsibility
“Companies of all sizes and in virtually every industry are struggling to manage the exploding amounts of data,” says Neil Mendelson, vice president for big data and advanced analytics at Oracle. “But as both business and IT executives know all too well, managing big data involves far more than just dealing with storage and retrieval challenges—it requires addressing a variety of privacy and security issues as well.” In a talk at the Technology Policy Institute’s 2013 Aspen Forum, Federal Trade Commission chairwoman Edith Ramirez described some big data pitfalls to be avoided. Though many organizations use big data for collecting non-personal information, there are others that use it “in ways that implicate individual privacy,” she noted.


White collar automation will bring new industrial revolution, says CEO
Change is hard, and we shouldn't be naïve about it. But there are two components here. One is productivity, which we all understand will be there, and the other is about progress. I often give people the example of a construction site. If you pass any construction site today, you will see highly specialized machines. Cranes, forklifts, bulldozers, and people working alongside them. Ultimately it is going to be about robot-human partnership. And if you look at that construction worker, productivity is through the roof, and that allows them to construct things we never thought possible. That's progress.


Why We Fail to Change: Understanding Practices, Principles, and Values Is a Solution
There is no simple answer to the question of why we fail to change – at least not in the form of a recipe. In fact, we have plenty of recipes, and they are one of the key reasons why we’ve kept repeating the same mistakes for more than 40 years. It’s not only that we have plenty of recipes but also how we’ve codified them – and then, of course, started certifying people. The end result is that it is easy for organizations to simply choose a method from a menu and expect everyone to comply with the method – and expect to repeat a success story. It’s not much of a surprise that it doesn’t work.



Quote for the day:

“Always forgive your enemies; nothing annoys them so much.” -- Oscar Wilde

June 02, 2015

4 tips to help CEOs find their CIO soulmate
Businesses can't be successful without a strategy, and neither can a CIO. When finding the right CIO for your company, you should spend time discussing the overall strategy of your company to make sure your ultimate goals align. Exceeding expectations as a CIO five years ago might qualify as simply meeting expectations today. As Sagalov puts it, "a CIO is someone who does more than just keep the Wi-Fi on -- he is responsible for strategically growing a company's information capabilities." A CIO needs to have a strategy that will allow the company to thrive and adapt. A strategic CIO is a proactive CIO, and it is important to ensure that the person you hire is willing to plan for the future, rather than react as it happens.


Security breaches a monthly headache for firms
Virus attacks were the most common type of security issue, reported by 81 percent of large companies. But over half (57 percent) had been targeted by phishing attempts, over a third (37 percent) had seen a denial of service attack, and nearly one in four (24 percent) said their networks had been breached by hackers. "Considering all breaches, there was a noticeable 38 percent year-on-year increase of unauthorised outsider attacks on large organisations, which included activities such as penetration of networks, denial of service, phishing and identity theft," the report noted. Businesses are pessimistic about their abilities to keep crooks out: over half expected to see more breaches in future.


Empower Your Application Teams
Application teams are the backbone of the revenue-generating capabilities of all companies. These creative people spend their off-hours working with the latest tools to keep their skills sharp. Increasingly, these teams are going into the public cloud to develop their applications. Why? Developers need resources in days. Procurement and IT quote resource delivery in weeks. As the pace of business increases, developers need tools that help them accelerate the design and deployment of applications. They want resources quickly, on demand. Watch this video to learn how Cisco meets the need for on-demand resources.


7012 Regs and Cyber insurance on collision course with small business
The 7012 regulations also require immediate reporting of any incident or threat to UCTI that is carried on or held in an IT system. NIST is the cognizant agency for Classified standards and operational regulations. The regulations themselves are a part of, and a driver to, a set of complex problems for industry — presently, with risk being transferred away from DoD to its contractors, who will find risk rebounding to them via their “cyber” insurance policies. This two-part article isn’t intended to fan the flames, but rather to give the context behind the regs, provide meaningful definitions for practical use, offer probable implications for industry, and set out why the seemingly most reasonable solution for businesses may be the most dangerous to them.


The drivers and inhibitors of cyber security evolution
“Coupled with the coming regulations that will require mandatory breach notification, it is surprising that many are still prioritising the same things they have always done, rather than evolving to ensure they can respond to threats that get through their current defences,” says FireEye European vice-president and chief technology officer (CTO) Greg Day. “Many organisations talk the talk, but want to walk the walk at a very slow and steady pace, while the most enlightened organisations are already spending more than 50% of their security budgets on progressive detection and response capabilities,” he says. The study found organisations are, on average, spending only 23% of their IT security budgets on detection and response, although this is expected to increase to 39% in the next two years.


MongoDB Gets SQL-Reporting Capability
MongoDB engineers have built a connector that takes a standard SQL query from Business Objects, Tableau, Cognos, or other SQL-based analysis systems, and translates it into a query that MongoDB understands. MongoDB is popular as a JSON document database, capable of storing email, reports, comments, and other forms of text as objects in a database. It has used its own query and reporting methods that in the past have been incompatible with SQL reporting systems, the ones that most data managers were familiar with, explained Kelly Stirman, vice president of strategy at MongoDB.
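To give a feel for what such a translation involves, here is a small, hypothetical sketch (not the connector's actual code) pairing a typical SQL aggregate with an equivalent MongoDB aggregation pipeline expressed via PyMongo. The collection and field names are invented for illustration.

```python
# Hypothetical illustration of the kind of SQL-to-MongoDB translation
# the connector performs. The SQL a BI tool might issue:
#
#   SELECT status, COUNT(*) AS n
#   FROM orders
#   WHERE total > 100
#   GROUP BY status;
#
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]                         # example database/collection

pipeline = [
    {"$match": {"total": {"$gt": 100}}},                  # WHERE total > 100
    {"$group": {"_id": "$status", "n": {"$sum": 1}}},     # GROUP BY status, COUNT(*)
]

for row in orders.aggregate(pipeline):
    print(row["_id"], row["n"])
```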


The Power of RAML
RAML, or the RESTful API Modeling Language, is a relatively new spec based on the YAML format, making it easily read by both humans and machines. But beyond creating a more easily understood spec (one that could be handed to the documentation team without worry), Uri Sarid, the creator of RAML, wanted to push beyond our current understanding and create a way to model our APIs before even writing one line of code. ... All of the RAML tools are open source and freely available at http://RAML.org/projects.... The API Notebook, on the other hand, takes interactivity and exploration a step further by letting developers use JavaScript to call your (and other) APIs. In the API Notebook they are also able to manipulate the data to see how it would work in a real-world use case.


CIOs need to plan and prepare for disruption
Whichever continent, whichever industry and whichever demographic or social group you look at, the signs of disruption are everywhere, and it’s accelerating at a frenetic pace driven by a new wave of global 21st century entrepreneurs who last year registered over 100 million new companies and who are all being powered by the same new democracy, ... Today, technology is becoming increasingly ubiquitous, and as it does so, new social, mobile and cloud-based technology platforms are helping people around the world collaborate and communicate more easily and quickly than ever before, find funding, information and expertise faster than ever before, and create, build, distribute and sell new products and services faster than ever before.


Customer-obsessed technology platforms: If you don't know, you're doing IT wrong
Unfortunately this is a terrible way to create applications, regardless of whether it's on the web, mobile, or any other emerging digital channel. The data is good, but we cannot start with our data in mind -- instead we must start with our customers' needs in mind. But why this change and why now? Our customers (and increasingly our employees) are being presented with so many more options from your competitors, both those known today and tomorrow's digital startups. Simply put, the barrier to creating new software solutions is approaching zero. Making this transformation is central to the BT Agenda -- applying technology to win, serve, and retain customers.


The Secret to Data Lake Success: A Data First Strategy
So why is the data warehouse failing to deliver on these requirements? The organization spent a lot of time and money to create a “slice and dice” environment that should give the business what they need. Unfortunately, in today’s environment, accounting for every question in one model is impossible. New data sources are emerging at a breakneck pace. New questions are sprouting up even faster. A highly engineered environment that only takes the data it needs upfront is going to have difficulty adapting to rapidly changing requirements. Faced with the complexity of data loading and transformation processes, as well as a highly intricate data model, the average data warehouse change takes nine months and costs over $1 million to complete. When you build a complex system with one purpose in mind, don’t expect agility.



Quote for the day:

“If you cannot do great things, do small things in a great way.” -- Napoleon Hill

June 01, 2015

5 ways to find and keep customer-focused IT pros
One option is to bring in workers with industry experience who can take on business-focused IT roles. ... Among people with those types of backgrounds, he said he looks for “attitude and passion to go after unsolved problems.” Another way to find people who will excel at working with customers is to get a sense of how they would solve a business problem with technology. Aaron Gette, CIO of Bay Clubs, a luxury fitness and country club company, said he cares less about titles and hot IT roles and more about intangible qualities. “I’m looking for nontraditional IT people. They like to talk to people, not just on social media, but actually socializing and being involved in initiatives,” he said. “They need to be involved in member forums and understand what’s working with our programs.”


Organisations are changing how they spend their cyber security budget
“Firms are coming to terms with the inevitability of a cyber breach,” said Duncan Brown, research director at PAC. “Rather than spending a majority of security budget on prevention, firms will apply a more balanced approach to budgeting for cyber attacks.” ... It’s vital that organisations find the right balance between prevention and response. An organisation that puts all its eggs in one basket and solely spends on prevention will find itself in a tough situation when it inevitably suffers a breach; ditto for those that spend solely on response. To find the right balance, organisations need to implement a framework that combines prevent and protect, and detect and respond – and enables them to work together.


Deep Learning Catches On in New Industries, from Fashion to Finance
“One of the things Baidu did well early on was to create an internal deep learning platform,” Ng said. “An engineer in our systems group decided to apply it to decide a day in advance when a hard disk is about to fail. We use deep learning to detect when there might’ve been an intrusion. Many people are now learning about deep learning and trying to apply it to so many problems.” Deep learning is being tested by researchers to glean insights from medical imagery. Emmanuel Rios Velazquez, a postdoctoral researcher at the Dana-Farber Cancer Institute in Boston, is exploring whether deep learning could help to more accurately predict a patient’s outcome from images of his or her cancer.
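The disk-failure use case mentioned by Ng is, at its core, a supervised learning problem: given recent telemetry for a drive, predict whether it will fail within the next day. The sketch below is a hypothetical framing of that problem with a small Keras network; the feature count, placeholder data and model are illustrative assumptions, not a description of Baidu's system.

```python
# Hypothetical sketch: "will this disk fail within 24 hours?" posed as a
# binary classification problem. Data and model are illustrative only.
import numpy as np
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

# X: one row per drive-day of telemetry (e.g. SMART counters, temperature);
# y: 1 if the drive failed within the following 24 hours, else 0.
X = np.random.rand(1000, 12).astype("float32")        # placeholder features
y = (np.random.rand(1000) < 0.05).astype("float32")   # placeholder labels

model = Sequential([
    Dense(64, activation="relu", input_shape=(12,)),
    Dense(32, activation="relu"),
    Dense(1, activation="sigmoid"),                    # probability of imminent failure
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print(model.predict(X[:3], verbose=0))                 # failure risk for three drives
```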


Private Cloud: Insurers' Secure Solution
“The insurance industry has often been slower to adopt public cloud than many other industries because of regulations of how data needs to be managed,” says Jeffrey Goldberg, vice president of research and consulting at Novarica. “Some of those are legitimate concerns about data security and some of it is also fear-based — they would like to get the advantages of public cloud but want to maintain control.” Private cloud models — which are either implemented on-premises behind a corporate firewall, or off-premises but within the client’s firewall and dedicated solely to the client — have begun to address insurers’ security concerns. This sector is growing fast: Research firm Technology Business Research forecasts 35% growth in the private cloud sector in 2015.


The Internet of Things Will Give Rise To The Algorithm Economy
Data is the oil of the 21st century. But oil is just useless thick goop until you refine it into fuel. And it’s this fuel – proprietary algorithms that solve specific problems that translate into actions – that will be the secret sauce of successful organizations in the future. Algorithms are already all around us. Consider the driver-less car. Google’s proprietary algorithm is the connective tissue that combines the software, data, sensors and physical asset together into a true leap forward in transportation. Consider high frequency trading. It’s a trader’s unique algorithm that drives each decision that generates higher return than their competitors, not the data that it accesses. And while we’re talking about Google, what makes it one of the most valuable brands in the world? It isn’t data; it’s their most closely guarded secret, their algorithms.


Q&A with Benjamin Wootton on DevOps Landscape in 2015
DevOps from an automation perspective is now a big enough field to be a specialist area. There are so many tools, and the space is moving so quickly, that people can concentrate on it full time and deliver competitive advantage to their businesses. Having a team of people working with these tools on these types of activities can really work and help all of the developers and testers to go faster, leveraging up their value to the business. I like to see senior engineers in this DevOps team who bring a DevOps mindset and career experience across dev, test and operations. They can then go out and coach other staff members onto the central automation platform, ideally giving those teams an increasing amount of ownership of the automation.


Cloud computing more about agile development than cost
"For CIOs, the message is clear: Shift into the driver seat, or others will," Forrester said in releasing its cloud forecast. "A lot of enterprises are voting with their budgets and they're adopting cloud across the board," Rymer says Small wonder then, that for many organizations, the first question about the cloud is a settled matter -- not a question of if, but when, and how. ... "The bottom line here is ... cloud is the next platform," Rymer says. "We don't get a lot of questions from clients anymore about whether or not they're going to go to public clouds. It's really how do we get there." So how do they get there? To Forrester, it is essential to bridge the gap between the IT shop and the business lines of an organization.


The enterprise technologies to watch in 2015
The new technologies on the list include a few that aren't well-known but I believe represent either key advances likely to grow in strategic importance (machine learning, data science), or new developments that offer very significant benefits tactically with relatively little effort to realize (containers, instant app composition, machine-to-machine systems). There are also a few long-standing categories which have re-emerged recently as leading areas of technology focus for most organizations with new approaches, or have actually developed into parallel tracks with different levels of impact, often with a clear separation of efforts within many companies (hybrid cloud and commercial public cloud, for example). I've also consolidated some of last year's items as well, as explained above.


Salesforce teams up with Google and others to break down big data tech barriers
“Salesforce Wave for Big Data connects the Analytics Cloud to the industry’s most comprehensive ecosystem of big data innovators. Now every company can extend any data source to business users to transform every customer relationship,” he added.  Google’s contribution will tackle the volume piece of the big data equation by allowing users to run advanced queries on their datasets, while Cloudera will provide users with a centralised hub where their information can be stored and analysed securely. Meanwhile, New Relic’s software analytics platform is being introduced to tackle velocity, by providing users with a means of deriving real-time information about the performance of a company’s web and mobile apps.


AI Supercomputer Built by Tapping Data Warehouses for Their Idle Computing Power
Data centers often have significant numbers of idle machines because they are built to handle surges in demand, such as a rush of sales on Black Friday. Sentient has created software that connects machines in different places over the Internet and puts them to work running machine-learning software as if they were one very powerful computer. That software is designed to keep data encrypted as much as possible so that what Sentient is working on–perhaps for a client–is kept confidential. Sentient can get up to one million processor cores working together on the same problem for months at a time, says Adam Beberg, principal architect for distributed computing at the company. Google’s biggest machine-learning systems don’t reach that scale, he says.



Quote for the day:

“No great manager or leader ever fell from heaven, it’s learned, not inherited.” -- Tom Northup