Showing posts with label COBIT 5. Show all posts

Daily Tech Digest - August 06, 2019

Evolution of the internet: Celebrating 50 years since Arpanet

Daily traffic on the internet surpassed 3 million packets in 1974. First measured in terabytes and petabytes, monthly traffic volume is now measured in exabytes (1 exabyte = 10^18 bytes). In 2017, global IP traffic ran at 122 exabytes per month, or roughly 1.5 zettabytes per year, according to Cisco's Visual Networking Index. By 2022, Cisco predicts, global IP traffic will reach 396 exabytes per month, or about 4.8 zettabytes per year. As traffic volume has grown, so too has the number of devices connected to the internet. Today, the number of devices connected to IP networks is approaching 20 billion; by 2022 there will be 28.5 billion networked devices, up from 18 billion in 2017, Cisco predicts. That's more than the number of people in the world. Overall, Cisco predicts there will be 3.6 networked devices per person by 2022, up from 2.4 in 2017. Smartphone traffic continues to grow and is poised to exceed PC traffic in the coming years: PCs accounted for 41% of total IP traffic in 2018 but will account for only 19% by 2022, while smartphones will account for 44% of total IP traffic by 2022, up from 18% in 2017, according to Cisco's data.
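The quoted figures are easy to sanity-check: monthly exabytes times twelve should land near the stated annual zettabyte totals (1 zettabyte = 1,000 exabytes). A quick illustrative calculation:

```javascript
// Sanity-check the Cisco VNI figures quoted above: monthly exabytes × 12
// should land near the stated annual zettabyte totals (1 ZB = 1000 EB).
function annualZettabytes(exabytesPerMonth) {
  return (exabytesPerMonth * 12) / 1000;
}

console.log(annualZettabytes(122)); // 2017: ≈ 1.46 ZB/year (quoted as 1.5)
console.log(annualZettabytes(396)); // 2022: ≈ 4.75 ZB/year (quoted as 4.8)
```

Both results round to the annual figures Cisco quotes.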


Why Every Developer Should Know a Bit of Technical Writing

First, technical writing can help you communicate more easily with your teammates. If you're collaborating with other software developers on a regular basis, you know the importance of exchanging ideas, ensuring you're working toward the same high-level goals, and solving problems together. Technical writing abilities help you formally structure these bits of communication so your coworkers can better understand them; with an efficiently written message, you can avoid most misunderstandings and ultimately work faster. You can also use your technical writing abilities to communicate more effectively with people outside your team, especially those with limited technical knowledge. Rather than using terms unique to the development field, or describing code directly, you'll have to find high-level ways to describe the challenges you're facing, or use metaphors so that other people can grasp what you're saying. Either way, you'll be more valuable in client meetings, and you'll be able to talk to account managers and team leaders in other departments in a way that makes sense to them, while still conveying what you need to convey.


Are developers honestly happy working 60-hour weeks?


The annual Stack Overflow survey is one of the most comprehensive snapshots of how programmers work, with this year's poll being taken by almost 90,000 developers across the globe. Commenting on the data, Robert Pozen, senior lecturer for technological innovation, entrepreneurship, and strategic management at the MIT Sloan School of Management, said that although many "white-collar professionals" are content to work longer than the standard 40-hour week, working hours can only be extended so far before they negatively affect them. "Many professionals are quite happy working 40 to 55 hours per week," he says. "But if professionals work 70 to 80 hours per week on a regular basis, their productivity will gradually deteriorate on average. They will lose focus, and the long work hours will undermine the rest of their lives. Of course, professionals can have fruitful work spurts on projects they like or think are important. But that is the exception, rather than the rule." For developers, that fall in productivity often maps to an increase in poor-quality, buggy code that will need to be fixed at some point, actually costing companies more in the long run.


What Millennials Think Of Boomers & Vice Versa

As with many misunderstandings at work, generational or otherwise, it’s always a good idea to take a step back and look for the upsides. Downsides are easy to find. (It’s why there are so many misunderstandings!) So the next time you find yourself looking across the generational divide with misgivings, here are some upsides to keep in mind about all the generations. Millennials owe a debt of gratitude to Gen X-ers for bringing a new generational identity to the workplace, one in which self-sufficiency and resourcefulness are highly valued, along with minimal management and maximum independence. This, combined with a bit of Gen X cynicism, paved the way for the Millennial perspective. Other Millennial advantages come from the time in history in which they grew up. For example, I’ve been surprised repeatedly by the exposure to other cultures that young people in this generation have had — high school students who spend a summer studying in South Korea, college students who opt for a gap year in Hungary, or who head to Ghana to work construction.


Evolution in action: How datacentre hardware is moving to meet tomorrow’s tech challenges


A demonstration system used separate memory and compute “bricks” (plus accelerator bricks based on GPUs or FPGAs) interconnected by a switch matrix. Another example was HPE’s experimental The Machine. This was built from compute nodes containing a CPU and memory, but instead of being connected directly together, the CPU and memory were connected through a switch chip that also linked to other nodes via a memory fabric. That memory fabric was intended to be Gen-Z, a high-speed interconnect using silicon photonics being developed by a consortium including HPE. But this has yet to be used in any shipping products, and the lack of involvement by Intel casts doubts over whether it will ever feature in mainstream servers. Meanwhile, existing interconnect technology is being pushed faster. Looking at the high performance computing (HPC) world, we can see that the most powerful systems are converging on interconnects based on one of two technologies: InfiniBand or Ethernet.


Developers Are More Remote-Based, Company Connected & Burnt Out


Remote work is the new normal for developers. It's not only something they prefer, but something they increasingly demand from employers. Eighty-six percent of respondents currently work remotely in some capacity, with nearly one-third working from home full time. Forty-three percent say the ability to work remotely is a must-have when considering an offer from a company. The traditional narrative of remote workers as isolated and disengaged from their companies is proving false for many: seventy-one percent of developers who work remotely said they feel connected to their company's community. But the issue hasn't disappeared entirely. The twenty-nine percent who don't feel connected say they feel excluded from offline team conversations or don't feel integrated into their company's culture when working remotely. The burnout problem is real. Two-thirds of all respondents said their stress levels have caused them to feel burnt out or work-fatigued, regardless of whether they work remotely. Developers expect remote work to improve work-life balance, but the reality doesn't always line up with that hope.


Think beyond tick-box compliance


According to Holt, compliance and the need to recognise and leverage the business value of data are both, at their core, data control challenges. In her experience, viewing them in this way makes aligning business and compliance objectives much less of a problem. "Organisations can begin to identify existing use cases and processes that depend on this control, and form interdisciplinary teams involving stakeholders from both compliance and other business roles to collaborate on shared outcomes and objectives. From this come shared processes and workflows, shared technology, and – to some extent – shared budgets. By intertwining compliance goals with the broader enterprise initiative for data control and value realisation, there's the potential for compliance to cease being a cost centre over time," says Holt. "Benefits such as improved customer relations and consumer trust provide 'softer' returns that are often difficult to measure quantitatively over the short term, but can be significant and should not be neglected in calculations," she adds.


The Phantom Menace in Unit Testing

Let me state up front that this is not a rant about unit testing; unit tests are critically important elements of a robust and healthy software implementation. Instead, it is a cautionary tale about a small class of unit tests that may deceive you by seeming to provide test coverage but failing to do so. I call this class of unit tests phantom tests because they return what are, in fact, correct results but not necessarily because the system-under-test (SUT) is doing the right thing or, indeed, doing anything. In these cases, the SUT “naturally” returns the expected value, so doing (a) the correct thing, (b) something unrelated, or even (c) nothing, would still yield a passing test. If the SUT is doing (b) or (c), then it follows that the test is adding no value. Moreover, I submit that the presence of such tests is often deleterious, making you worse off than not having them because you think you have coverage when you do not. When you then go to make a change to the SUT supposedly covered by that test, and the test still passes, you might blissfully conclude that your change did not introduce any bugs to the code, so you go on your merry way to your next task.
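The failure mode described above can be reduced to a few lines. In this sketch (all names are illustrative, not from the article), the fixture's default state already equals the expected result, so the test passes whether the system-under-test does the right thing, the wrong thing, or nothing; starting from a state that contradicts the expectation removes the phantom:

```javascript
// A phantom test in miniature. The SUT is supposed to deactivate expired
// accounts, but the test fixture is already inactive, so the expected
// value holds even if the method's body were accidentally empty.
class Account {
  constructor() { this.active = false; } // default happens to equal "expected"
  deactivateIfExpired(now, expiry) {
    if (now > expiry) this.active = false;
  }
}

// Phantom test: passes for the right reason, the wrong reason, or no reason.
const acct = new Account();
acct.deactivateIfExpired(Date.parse('2019-08-06'), Date.parse('2019-01-01'));
console.assert(acct.active === false, 'account should be deactivated');

// Non-phantom version: start from a state that contradicts the expectation,
// so only the SUT's action can make the assertion pass.
const acct2 = new Account();
acct2.active = true;
acct2.deactivateIfExpired(Date.parse('2019-08-06'), Date.parse('2019-01-01'));
console.assert(acct2.active === false, 'account should be deactivated');
```

The second variant is the cheap fix: arrange the fixture so the expected value cannot occur "naturally".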


Evaluate the COBIT framework 2019 update


ISACA updated every part of the COBIT framework for 2019. The changes and additions to COBIT 2019 are encapsulated within the COBIT document suite, which is available to ISACA members for free. The principal changes include a new publication within the core framework, several new objectives, updates to security practices and updated references to other standards, guidelines and regulations. Four core publications express the COBIT framework. The introduction and methodology publication provides definitions, explains management objectives and lays out the COBIT framework's structure. The governance and management objectives publication details the COBIT model and all constituent governance and management objectives, each associated with a specific process. A design publication, which is new in COBIT 2019, offers practical and prescriptive guidance that enables adopters to put COBIT into practice within the specific needs of their organizations. The fourth, an implementation guide, walks adopters through rolling out and continually improving a governance program.


Lessons Learned From A Year Of Testing The Web Platform

Certain kinds of failures had side-effects that we didn’t anticipate. Even though our fancy automatic recovery mechanisms kicked in, the workers were doomed to fail all subsequent attempts. That’s because the unexpected side-effects persisted across independent work orders. The most common explanation will be familiar to desktop computer users: the machines ran out of disk space. From overflowing logs and temporary web browser profiles, to outdated operating system files and discarded test results, the machines had a way of accumulating useless cruft. It wasn’t just storage, though. Sometimes, the file system persisted faulty state. This entire class of problem can be addressed by avoiding state. This is a core tenet in many of today’s popular web application deployment strategies. The “immutable infrastructure” pattern achieves this by operating in terms of machine images and recovering from failure by replacing broken deployments with brand new ones. The “serverless” pattern does away with the concept of persistence altogether, which can make sense if the task is small enough.



Quote for the day:


"If you want extraordinary results, you must put in extraordinary efforts." -- Cory Booker


Daily Tech Digest - January 16, 2019

The Rise of Automated Machine Learning


AI and machine learning require expert data scientists, engineers, and researchers, and those experts are in short supply worldwide right now. The ability of autoML to automate some of the repetitive tasks of ML compensates for the lack of AI/ML experts while boosting the productivity of the data scientists an organization already has. By automating repetitive ML tasks -- such as choosing data sources, data prep, and feature selection -- marketing and business analysts can spend more time on essential tasks, while data scientists build more models in less time, improve model quality and accuracy, and fine-tune more new algorithms. More than 40 percent of data science tasks will be automated by 2020, according to Gartner. This automation will increase the productivity of professional data scientists and broaden the use of data and analytics by citizen data scientists. AutoML tools for this user group usually offer a simple point-and-click interface for loading data and building ML models. Most autoML tools focus on model building rather than automating an entire, specific business function such as customer analytics or marketing analytics.


Model-driven RESTful API for CRUD and More

This article introduces a model-driven RESTful API for CRUD (Create, Read, Update, Delete). With it, you can write simple models (specifying a database table and the set of columns to be exposed) and the REST endpoints for CRUD will become available automatically. No hand-coding of any SQL is necessary. The concept could be implemented on different technology stacks and languages. Here, I used JavaScript (which generates SQL) with Node.js, Express, and PostgreSQL. Most projects need to Create, Read, Update, and Delete objects. When these objects are simple enough (one driving table and a few columns in the database), the code is very similar from one object to the next. In fact, the patterns are the same, and the only differences are the names of the tables and the names and types of the columns. Of course, there will always be complex endpoints which need to be written by hand but by automating the simple ones, we can save a lot of time.
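The article's actual implementation isn't reproduced here, but the core idea can be sketched in a few lines of JavaScript: a model names one table and the columns to expose, and the CRUD SQL falls out mechanically, using positional placeholders in the style of the pg driver so values are bound rather than concatenated. (Illustrative code in the spirit of the article, not its real codebase.)

```javascript
// Minimal model-driven CRUD: derive the four SQL statements from a model
// that specifies a table and its exposed columns. Values are passed via
// PostgreSQL-style $n placeholders, never interpolated into the SQL.
function crudSql(model) {
  const cols = model.columns.join(', ');
  const placeholders = model.columns.map((_, i) => `$${i + 1}`).join(', ');
  return {
    create: `INSERT INTO ${model.table} (${cols}) VALUES (${placeholders})`,
    read:   `SELECT ${cols} FROM ${model.table} WHERE id = $1`,
    update: `UPDATE ${model.table} SET ` +
            model.columns.map((c, i) => `${c} = $${i + 1}`).join(', ') +
            ` WHERE id = $${model.columns.length + 1}`,
    remove: `DELETE FROM ${model.table} WHERE id = $1`,
  };
}

const sql = crudSql({ table: 'customers', columns: ['name', 'email'] });
console.log(sql.read); // SELECT name, email FROM customers WHERE id = $1
```

Wiring each generated statement to an Express route (POST, GET, PUT, DELETE) then gives the automatic REST endpoints the article describes.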


Progressing beyond a pre-digital age: Building the business case for ‘digital HR’

Humans are, well, only human. Mistakes happen, but a mistake can have a huge impact on an organisation’s health and future success. Introducing technology to manage a range of processes can help to reduce and mitigate HR related risk by minimising all manner of issues from poor HR consistency and visibility, to data loss. Manually updating changes in spreadsheets can be a cumbersome and ineffective process, especially when the data is being entered into multiple documents. Research from Salesforce shows that 88% of all spreadsheets have significant errors in them. Applying intelligent automation will not only reduce the risk of human mistakes but also help to flag errors and data problems before they create a negative impact on the business. The huge issue of risk and compliance aside, automation reduces the HR admin mountain and allows a focus on people strategies which are so critical when competing for talent and reducing churn. 


Get ready for edge computing’s rise in 2019

While many of you may see edge as exclusive to IoT, its value is much wider and will prove as critical to driving up customer experience as content delivery networks (CDNs) were in the early days of the web... which explains why you are now seeing edge compute and AI services from all the major cloud vendors and on the road maps of the leading telecom companies. Twenty-seven percent of global telecom decision makers who responded this year to the Forrester Analytics Global Business Technographics® Mobility Survey, 2018, said that their firms are either implementing or expanding edge computing in 2019. Many of these firms will require new wireless tools and updated skill sets to achieve this digital transformation. This aligns with Verizon's recent employee buyout offer, as a result of which over 10,400 of its staff will be gone next year, driving nearly $10 billion in savings that it can apply to its edge-compute-empowered 5G network. And speaking of CDNs, nearly every one of these vendors is adding edge compute to their core market values.


World's first robot hotel massacres half of its robot staff

The story highlights the shortcomings of purportedly "state of the art" AI automation that are rarely discussed. One is that such systems are installed to solve a management problem rather than a customer need, as was the case here: the hotel is in an area with an acute labour shortage. Secondly, they're just plain annoying. As hotel manager Hideo Sawada explained: "When you actually use robots you realize there are places where they aren't needed - or just annoy people". While robotics has advanced steadily in industry, the picture is different in consumer electronics. Trade group the International Federation of Robotics noted that sales of industrial robots had doubled in five years. But the growth is largely cyclical, IFR president Junji Tsuda admitted. Adoption doubled even more dramatically between 2009 and 2010, which had nothing to do with AI and a lot to do with the falling cost of sensors and microelectronics. In industries where automation is already highly advanced, such as car production, it may not move the dial much: wage rates largely govern the substitution phenomenon.


The Key Cybersecurity Takeaways From The Recent SEC Charges

Hackers continue to prefer phishing schemes to almost any other infiltration or social engineering tactic. In part, their effectiveness ties into their mundanity; phishing attacks look like legitimate emails, and employees without proper training will reliably open their emails. Phishing attacks, therefore, provide a low effort, high impact cyber threat. Furthermore, if it can hit the SEC, it can hit your enterprise as well. To prevent a phishing attack from inflicting damage on your databases, make sure your employees can recognize a phishing attack if they receive one; there are tell-tale signs for almost all of them. Incentivize recognizing phishing attacks before they occur, either through a small rewards program or by making cybersecurity a part of your employees’ everyday job duties and performance reviews. Additionally, ensure your cybersecurity platform includes a SIEM solution with strong threat detection capabilities. Your enterprise can also benefit from an email security solution to prevent phishing attacks from reaching your inboxes.
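The "tell-tale signs" mentioned above lend themselves to simple checks that can anchor an employee training or rewards program. This is an illustrative sketch with made-up heuristics and a made-up message, not a substitute for an email security or SIEM solution:

```javascript
// Toy phishing heuristics: the rules here are examples of the tell-tale
// signs training should teach, not a production filter.
function phishingSigns(mail) {
  const signs = [];
  // 1. Display-name domain doesn't match the actual sending domain.
  const claimed = (mail.fromDisplay.match(/@([\w.-]+)/) || [])[1];
  const actual = mail.fromAddress.split('@')[1];
  if (claimed && claimed !== actual) signs.push('display name/domain mismatch');
  // 2. Pressure language in the subject line.
  if (/urgent|immediately|suspended/i.test(mail.subject)) signs.push('pressure language');
  // 3. Links that are not HTTPS.
  if (mail.links.some((l) => !l.startsWith('https://'))) signs.push('insecure link');
  return signs;
}

const suspect = {
  fromDisplay: 'IT Support <help@sec.example.gov>',
  fromAddress: 'help@mail-example.xyz',
  subject: 'URGENT: account suspended',
  links: ['http://mail-example.xyz/reset'],
};
console.log(phishingSigns(suspect)); // all three signs fire
```

Real attacks are subtler, which is why the article pairs training with SIEM-grade threat detection.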


Major Security Breach Discovered Affecting Nearly Half of All Airline Travelers


With the PNR and customer name at our disposal, we were able to log into ELAL’s customer portal and make changes, claim frequent flyer miles to a personal account, assign seats and meals, and update the customer’s email and phone number, which could then be used to cancel/change flight reservation via customer service. Though the security breach requires knowledge of the PNR code, ELAL sends these codes via unencrypted email, and many people even share them on Facebook or Instagram. But that’s just the tip of the iceberg. After running a small and non-threatening script to check for any brute-force protections, none of which were found, we were able to find PNRs of random customers, which included all of their personal information. We contacted ELAL immediately to point out the threat and prompt them to close the breach before it was discovered by anyone with malicious intentions. We suggested stemming the vulnerability by introducing captchas, passwords, and a bot protection mechanism, in order to avoid using a brute-force approach.


What is COBIT? A framework for alignment and governance

New concepts and terminology have been introduced in the COBIT Core Model, which includes 40 governance and management objectives for establishing a governance program. The performance management system now allows more flexibility when using maturity and capability measurements. Overall, the framework is designed to give businesses more flexibility when customizing an IT governance strategy. Like other IT management frameworks, COBIT helps align business goals with IT goals by establishing links between the two and creating a process that can help bridge a gap between IT — or IT silos — and outside departments. One major difference between COBIT and other frameworks is that it focuses specifically on security, risk management and information governance. This is emphasized in COBIT 2019, with better definitions of what COBIT is and what it isn’t. 


The report on the security analysis of radio remote controllers for industrial applications notes that the use of obscure, proprietary protocols instead of standard ones makes controllers vulnerable to command spoofing, so an attacker can selectively alter their behaviour by crafting arbitrary commands, with consequences ranging from theft and extortion to sabotage and injury. "The legacy and widespread RF technology used to control industrial machines is affected by serious security issues that impact several market verticals, applications, products and brands," the report said. The researchers warned that this widely used legacy RF technology can be abused to sabotage equipment, steal goods by manipulating equipment, and extort payment to hold off or cease equipment interference.


Getting Started with PouchDB - Part 1

PouchDB is an open-source JavaScript NoSQL database designed to run offline within a browser. There is also a PouchDB server version that can be used when online; the two databases synchronize with one another via a simple API call. You may also use CouchDB on the server to synchronize your data. A NoSQL database is storage where there is no fixed table structure of the kind found in a relational database. NoSQL databases use a few different methods to store data: column, document, graph, and key-value pair. Of these, the most common are column and document. PouchDB is document-oriented: data is stored as a series of JSON objects, with a key value assigned to each document. Each document in PouchDB must contain a property called _id, and the value in the _id field must be unique per database. You may use any string value you want for the _id field. In this article, I am going to use a value that is very simple.
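To make the _id rule concrete, here is a tiny in-memory sketch (not PouchDB itself, whose real put/get API is asynchronous) showing why every document needs a unique _id: it is the key the database files and retrieves documents by, and a duplicate is a conflict.

```javascript
// In-memory stand-in for a document store, illustrating the _id contract:
// every document must carry an _id, and _id values are unique per database.
class TinyDocStore {
  constructor() { this.docs = new Map(); }
  put(doc) {
    if (!doc._id) throw new Error('document must contain an _id');
    if (this.docs.has(doc._id)) throw new Error(`conflict: ${doc._id} exists`);
    this.docs.set(doc._id, { ...doc });
    return { ok: true, id: doc._id };
  }
  get(id) { return this.docs.get(id); }
}

const db = new TinyDocStore();
db.put({ _id: 'customer:42', name: 'Ada' });
console.log(db.get('customer:42').name); // Ada
```

PouchDB behaves analogously: `db.put()` rejects a document without an `_id` variant of a key, and rejects a duplicate `_id` with a conflict error.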



Quote for the day:


"Your talent and giftedness as a leader have the potential to take you farther than your character can sustain you. That ought to scare you." -- Andy Stanley


Daily Tech Digest - July 21, 2017

Big Data Technology: In-House vs Outsource

For any technological venture, speed to market is key to determining overall success. This includes the development of internal technology. From project inception to launch, creating a big data solution can take as much as two to three full years. That's two-plus years for a solution you need today. And while the need for an immediate solution is a sizable one, the lifecycle of the technology isn't. A two-year wait can create one of two problems: either your newly developed solution is nearly outdated at launch, or you become caught in an unending cycle of redesign in an attempt to get ahead of a rapidly progressing technological landscape. Meanwhile, with the wide adoption of the cloud-based SaaS model, the speed of integration and deployment for third-party solutions has never been faster.


Scammers demand Bitcoin in DDoS extortion scheme, deliver empty threats

This week, the FBI says they’ve investigated hundreds of these cases, including several in Indiana – home to several major companies, the Indy 500, and this reporter. However, there has been no indication of attacks. When the targeted organization fails to meet the deadline or refuses to pay, those responsible for the demands fade into the background and the promised DDoS never happens. So, while the extortion attempts are turning out to be empty threats for now, that wasn’t always the case. In fact, it’s likely the people responsible for the most recent threats are using the ‘Anonymous’ and ‘Lizard Squad’ brands because they’ve been associated with DDoS attacks in the past. Most administrators will remember the panic that swept through enterprise and SMB channels when Anonymous was using DDoS as their primary means of protest in 2010, something they still do to this day.


A coding error led to $30 million in ethereum being stolen

The perils of a blockchain's immutable transactions were brought home yesterday as some $30 million in ether was stolen due to a bug in the code of a well-known ethereum wallet. It could have been worse: an additional $75 million was at risk because of the same coding fault, but a group of vigilante hackers rescued those funds and are promising to give them back to their owners. The ether was grabbed from the wallets of at least three projects that had recently completed so-called "initial coin offerings" (ICOs). More worryingly for ICO boosters, the vigilante hackers, who call themselves "The White Hat Group", saved funds from wallets belonging to some of the biggest coin offerings to date. The bug has now been fixed. The affected wallets required multiple people to sign off on transactions, which was supposed to make them more secure.


The 3 most in-demand cybersecurity jobs of 2017

"For lower-level professionals, companies need to consider if they want to pay a premium for an analyst to get every skillset they're looking for, or if they want to invest in trainings and seminars," Zafarino said. If you chose the latter, it's key to bring in a consultant for a short amount of time to help get the employee up to speed. "In the long term, that person is probably perfect, especially if you don't have the money at hand," he said. "If you do, you absolutely want to go with the more senior resource, and you can bring in lower-level people along the way." Zafarino said he commonly sees two paths to becoming a cybersecurity professional. In the first, a person comes from a computer science background, and can usually command a higher salary.


Bank workloads to be taken over by machines

Cognitive technologies, or machines that perform human tasks, have become cheap enough for banks to deploy throughout their organisations. McKinsey said that automating tasks will "free up capacity" for staff to focus on higher-value work, such as research, generating new ideas or tending to clients. "This is really starting to take steam and it's going to transform the industry over the next two to three years," Jared Moon, a McKinsey partner who co-wrote the report, said in an interview. These cognitive technologies are estimated to free up 20 to 30% of employees' capacity in units processing trades. Automation has not unanimously been welcomed with open arms: workers worry they will be replaced by machines that can do their jobs at a fraction of the cost. However, this won't be the reality.


Data Mining - What, Why, When

The broad benefit of identifying hidden patterns and consequent relationships, and of establishing predictive models, can be applied to many functions and contexts in organizations. Specifically, customer-focused functions can mine customer data to acquire new customers, retain customers and cross-sell to existing customers; other examples are enhancing customer lead conversion rates and building future sales prediction models or new products and services. Financial sector companies can build fraud-detection and risk mitigation models. The energy and manufacturing sectors can come up with proactive maintenance and quality detection models. Retailers can build stock placement and replenishment models for stores and assess the effectiveness of promotions and coupons. Pharmaceutical companies can mine large chemical compound data sets to identify agents for the treatment of diseases.


COBIT 5 for Risk—A Powerful Tool for Risk Management

One would think that, IT being critical to an organization’s operations, the risk related to IT and IT security would be covered by many different risk management frameworks, including the Committee of Sponsoring Organizations of the Treadway Commission (COSO) for enterprise risk management (ERM), the Risk Management Society’s RIMS Risk Maturity Model (RMM), Project Management Institute’s (PMI) Project Risk Management, International Organization for Standardization (ISO) / International Electrotechnical Commission (IEC) 27005 Information technology—Security techniques ... Arguably, there is only one globally accepted and in-use business framework to employ when it comes to risk management in the IT domain and, specifically, the governance and management of enterprise IT. That framework is COBIT 5.


How to monitor MongoDB database performance

In a smoothly running set of primary and secondary nodes (referred to as a “replica set”), the secondaries quickly copy changes on the primary, replicating each group of operations from the oplog as fast as they occur (or as close as possible). The goal is to keep replication lag close to zero. Data reads from any node should be consistent. If the elected primary node goes down or becomes otherwise unavailable, a secondary can take over the primary role without impacting the accuracy of data to clients. The replicated data should be consistent with the primary data before the primary went down. Replication lag is the reason that primary and secondary nodes get out of sync. If a secondary node is elected primary, and replication lag is high, then the secondary’s version of the data can be out of date.
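Replication lag itself is just the gap between the primary's last applied operation time and a secondary's. A minimal illustrative calculation (the function name and date values are assumptions for the sketch, not MongoDB's own API, though the optime dates mirror what `rs.status()` reports per member):

```javascript
// Replication lag in miniature: how far a secondary's last applied
// operation trails the primary's, in seconds.
function replicationLagSeconds(primaryOptimeDate, secondaryOptimeDate) {
  return (primaryOptimeDate.getTime() - secondaryOptimeDate.getTime()) / 1000;
}

const primary = new Date('2019-01-16T12:00:10Z');
const secondary = new Date('2019-01-16T12:00:03Z');
console.log(replicationLagSeconds(primary, secondary)); // 7 seconds behind
```

A monitoring check would alert when this value stays above a threshold, since a lagging secondary elected primary serves stale data.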


7 Things Your IT Disaster Recovery Plan Should Cover

“Completing a BIA for major IT systems will allow for the identification of system priorities and dependencies,” notes Testoni. “This facilitates prioritizing the systems and contributes to the development of recovery strategies and priorities for minimizing loss. The BIA examines three security objectives: confidentiality, integrity, and availability.” Testoni adds that a BIA helps establish priorities for your disaster recovery, business continuity, and/or continuity of operations plans. “A standard approach to developing a comprehensive disaster recovery plan is to first develop the policy, then conduct the BIA,” he says. “After creating a prioritization with the BIA, contingency strategies are developed and formalized in a contingency plan.”


Android O: The Reddit AMA's 8 most interesting reveals

Google teased us with dark mode on both the Android N and O developer previews, but it’s not making it into the full release anytime soon. The reason? “Reliable and consistent theming is hard.” Numerous questions about themes and dark mode stacked up on the Reddit board, and Android engineer Alan Viverette addressed it thusly: “There are technical and logistical issues with theming. The technical side is largely solved in O with Runtime Resource Overlay support (a Sony framework that allows the system to modify the look and feel of an app while it is running); however, we still don’t have stable APIs for describing what can be themed or adequate ways to verify that existing applications properly support theming.”



Quote for the day:


"It's the little details that are vital. Little things make big things happen." -- John Wooden



Tech Bytes - Daily Digest: October 17, 2016

How to hire your employer, Bringing security back to the top of the board room agenda, Don't get burned by data center hot spots, Learn actionable insights & practical guidance from COBIT, Threat response automation: The next frontier for cybersecurity and more.

Evolving DCIM market shows automation, convergence top IT's wish list

IT also needs to do more with less. Data volumes double every few years, but IT budgets are increasing at low, single-digit rates. As a result, data center managers are having trouble keeping up with the volumes of information. Consequently, users want DCIM products to be more than just monitoring tools; they want to weave them into the data center tapestry. Combining a DCIM tool with change management software creates new automation possibilities. For instance, a company could automatically generate a work order, which indicates the rack and position where an add-on device can be installed, specifies the devices and ports that will be connected -- such as power, LAN and cables -- and links that information to relevant applications.
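The work-order scenario above can be sketched as a small function that turns inventory and placement data into an installation task. All field names here are illustrative assumptions, not a real DCIM vendor's schema:

```javascript
// Sketch of the DCIM + change-management integration: derive an install
// work order (rack, position, and connections) from inventory data.
function buildWorkOrder(device, placement, connections) {
  return {
    task: `Install ${device} in rack ${placement.rack}, position ${placement.position}`,
    connections: connections.map((c) => `${c.type}: connect to ${c.port}`),
  };
}

const order = buildWorkOrder(
  'web server',
  { rack: 'R12', position: 'U24' },
  [{ type: 'power', port: 'PDU-A/6' }, { type: 'LAN', port: 'sw3/24' }]
);
console.log(order.task); // Install web server in rack R12, position U24
```

Linking the generated order to the relevant applications closes the loop the article describes between monitoring and change management.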


How to hire your employer

When we find ourselves stuck in unhappy careers—and even unhappy lives—it is often the result of a fundamental misunderstanding of what really motivates us. As we discussed in our book How Will You Measure Your Life, just because you’re not dissatisfied with your career path doesn’t mean you’re satisfied with it. The things that you might easily put on your resume or talk about at a cocktail party, such as your job title or how big your office is, are not what really motivates most people in the long run. Instead, we’re driven by what we call “intrinsic” factors. They’re more difficult to see when you’re sizing up a job opportunity, but extremely important. Instead of simply asking about the perks and benefits of a new job, try asking yourself


Bringing security back to the top of the boardroom agenda

Security needs to be part of the design from the start and not bolted on afterwards. Too often security and compliance are an afterthought, once solutions have already been built and the projects have started. Security needs to be part of the foundations of IT. Building it into the core platform throughout your business allows for a much faster route to market, as fewer things need to be altered when moving from development, to testing and finally to production. Having a software-defined architecture for security, built into the fabric of the IT infrastructure from the data centre to the device, is needed to embrace security in every phase of IT from the outset.


How to Design the Optimal Business Intelligence Dashboard

Unclear goals can dampen the impact of any IT project, and BI implementation is no exception. You need to consider your departmental goals and how they relate to broader business goals, and keep these goals in mind when designing your dashboards. Ask the bigger questions - How will these dashboards help achieve goals? What sort of metrics should we display that will improve our sales/costs/efficiency/customer satisfaction? IT cannot build a BI platform based on what it feels users will want; it needs input from the actual user base. For some companies, the challenge comes on the back end, in terms of the technical troubles with integrating multiple disconnected data sources into the BI solution. They might have the right dashboard in place and know what metrics they want to examine, but the flow of data simply isn’t there.


Don't get burned by data center hot spots

Some computer room air conditioning setups reflect insufficient knowledge of how air really moves in a data center, causing even worse cooling conditions. In modern designs, redundant units run simultaneously with normal units, but at reduced speed, so you don't realize added servers are stealing redundant capacity until a cooling unit fails or is turned off for maintenance. Thankfully, servers can tolerate a higher operating temperature for several days with little negative effect. ASHRAE's allowable thermal envelope goes up to 32 degrees Celsius (89.6 degrees Fahrenheit) in emergencies, but marginal redundancy -- combined with poorly planned computing hardware additions -- can cause serious overheating and thermal shutdowns within a short time after a cooling unit has quit.


Slack CEO describes 'Holy Grail' of virtual assistants

You might scour your email or document-management systems, using such search terms as "term sheet," and pull up a handful of emails or files. Once you find the dates you might go to a separate financial reporting tool to look up the revenue information. Such a process could take you as much as 45 minutes. Now imagine a tool -- a bot network operating as one, if you will -- that could find the information in disparate apps, cross-reference it and generate the correct answer in seconds. Butterfield estimates that such a system would result in productivity gains of anywhere from 10 percent to 30 percent. “That is the knowledge worker equivalent of giving a ditch digger a backhoe instead of a shovel," Butterfield says. "I would love it if we were successful building something like that."


Learn Actionable Insights & practical guidance from COBIT

COBIT can be complex or simple, depending on the perspective from which it is read, understood and implemented. Its philosophy can complement and supplement a professional’s practical experience, and a fundamental understanding of its core principles and design rationale makes it far easier to implement. That understanding demystifies the structure and enables users to navigate and select relevant content from the COBIT knowledge repository from a practical perspective of governance, assurance, risk and compliance, at either a macro or micro level. The best way to enhance COBIT expertise is to implement it in real-life situations and scenarios.


Threat Response Automation: The Next Frontier for Cybersecurity

Roughly speaking, we could divide cybersecurity software evolution into two waves. The first wave was dominated by rule-based deterministic solutions. A classic example is the firewall. Firewalls apply simple policies, such as blocking inbound traffic, ports or protocols. The second wave of solutions consists of “fuzzy” rules and heuristics. We could perhaps mark the beginning of this wave of solutions with the first Intrusion Detection System (IDS). These solutions employed ML algorithms to spot anomalies and detect malicious activity. In fact, most contemporary cybersecurity vendors take pride in how their solutions utilize ML. Fraud analytics, web gateways, endpoint protection solutions and network sniffers, all utilize ML in their offerings.
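The "fuzzy," second-wave approach described above can be illustrated with a toy baseline check — a generic z-score heuristic on traffic volumes, not any vendor's actual detection algorithm:

```python
import statistics

def find_anomalies(samples, threshold=2.5):
    """Flag values that deviate from the sample mean by more than
    `threshold` standard deviations (a simple z-score heuristic)."""
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # perfectly uniform data has no outliers
    return [x for x in samples if abs(x - mean) / stdev > threshold]

# Baseline traffic (requests/min) with one spike that an IDS-style
# heuristic would surface for review.
traffic = [120, 115, 130, 125, 118, 122, 900, 119]
print(find_anomalies(traffic))  # → [900]
```

Real IDS and fraud-analytics products layer far richer features and learned models on top, but the core idea — score deviation from a learned baseline rather than match a fixed rule — is the same.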


Cut to the Chase: How a Data-Driven Culture Fosters Success

“About a year ago, we got the opportunity to use the Domo platform,” he said. At first he just gave licenses to his growth leaders around the country. “Then I decided that maybe I should dig deeper into this, which was one of the best things I could have done.” That’s when his conversations with national teams took a sharp turn, and for the better. “It allowed me to cut through a lot of the data, and cut through to the information that would really help me manage the group. Domo actually allows me to get a view into those offices like I never had before.” The end result, he said, was a significant transformation in how quickly and effectively he and his team could identify new opportunities, and solve otherwise challenging client issues.


Don’t fall behind when it comes to migrating to the cloud

Security is also a strong benefit of cloud storage. While many assume that opening up a company’s database to online storage may run a higher risk of security breaches, in fact the opposite is often true. Because of their large scale and intensive client security requirements, cloud hosting providers often have better security than is reasonably maintained in-house by small and medium-sized businesses. Off-site backups, 24/7 monitoring, and enterprise-grade security audits are typically out of the price range of smaller organizations. It’s also important to note that not every application is right for the cloud. While migrating an internal communications tool, like a social intranet, makes practical sense for the cloud, highly regulated and sensitive data like credit card information or health care records may not be suitable.



Quote for the day:


"Liberty is always dangerous, but it is the safest thing we have." -- Harry Emerson Fosdick


September 04, 2015

A degree in data science is in demand

The work of a data scientist is really two-fold. First, the data scientist must pull together all this data, which is often just a collection of garbled text or numbers, and clean it up to the point where it can be analyzed. Then, the data scientist has to know how to extract meaningful information from the cleaned-up data. “Big data represents one of the fastest growing areas of business, estimated to become a 17-trillion-dollar industry by 2020," wrote Becker College when it introduced its new data science program earlier this year. Locally, Worcester Polytechnic Institute and Becker offer data science programs; both are convinced that data science is already a desired career path for their students.
WPI's data science program is entering its second year; it currently offers a two-year, graduate-level degree in data science, and this fall, is adding a doctorate-level degree.


US Army’s Cyber War Strategy is Not Just for Military Use

Taking threat sensor data, removing noise and analyzing the data will provide decision makers with the ability to forecast, gain up-to-date battle damage assessments (BDA) and supply geolocation information on the enemy and the electronic signatures our own forces generate. Convergence is going to be achieved by consolidating its cyber forces operating across multiple departments into single cross-operational units, removing impediments to information sharing. By fiscal year 2017, the U.S. Army Cyber Command (ARCYBER) will have 41 Cyber Mission Forces operationally capable. They will combine cybersecurity, electronic warfare and signal doctrine into single units. The units will use past lessons learned to develop new doctrines in cyber security.


How Edge Data Center Providers are Changing the Internet’s Geography

Ultimately, location is the main way for companies like EdgeConneX to differentiate from the big colo players like Equinix or Interxion. Edge data center providers are essentially building in tier-two markets what Equinix and its rivals have built in the big core markets: hubs where all the players in the long chain of delivering content or services to customers interconnect and exchange traffic. These hubs are where most of the internet has lived and grown for the bulk of its existence, and edge data center companies are building smaller hubs in places that don’t already have them but are becoming increasingly bandwidth-hungry.


What Do Marketers Really Want in Data and Technology?

You may have heard of Data-as-a-Service (DaaS). Companies are touting DaaS as the next big thing and as a solution that gives marketers an “unfair competitive advantage.” By linking data with technology, DaaS is completely changing the game through a new model of fast-moving and real-time data acquisition. As the name implies, Data-as-a-Service begins with the data. Specifically, a company’s internal data, third-party data, real-time fast data, and unique and hard-to-find data (HTFD) sourced from the Big Data ecosystem. With technology, this data is structured to create insight into a company’s best customers and ideal prospects. Real-time knowledge is also used to learn who is actively in market for products and services, who is searching for competitors, or who is posting to social media for product recommendations.


Leveraging COBIT to Implement Information Security (Part 3)

In the context discussed here, it is envisaged that controls within the system are selected by management on a risk-assessed basis to address the perceived threats to the security of the organisation’s core business processes. Once selected, the ISMS is the basis for collecting evidence for operation and reviewing the efficacy of the implementation on an ongoing basis as part of the security forum. The forum is created by senior management, typically the chief executive officer (CEO), as a collaborative round table where managers from IT security, IT, human resources (HR) and major business functions can come together to make decisions on the basis of regular reporting from the system.


Disruptive tech and its impact on wireless protocols and networks

Internet of Things is not a new concept. It's been around for a long time. We used to call it telemetry or sensor-based computing. But the idea that we can do it today at a very low cost and that we can automate so many applications -- medical applications, security, energy management, all kinds of things like that -- means that there's going to be more and more happening on the network over time. And many of those applications will be mobile. (Not everything in IoT is mobile, but a lot of it will be.) So planning for that in terms of capacity, [security and cost is] made more complex. So, even though mobility opens up a lot of opportunities, it does come with a set of costs that we didn't have before.


Indoor positioning – Are we nearly there yet?

If the object you are locating and tracking happens to have a device with some unique identifier attached to it, like a tag or smart phone, things become significantly easier. Now you can have many fixed transmitters sending out pulses, getting received by the device that can then send out a “reply” rather than the reflected pulse that can also contain its unique identifier. The transmitters can be simple and omnidirectional, but then you need a few of them (remember each one defines a circle; in the plane, i.e., in 2D, at least 3 transmitters are needed to determine a unique position) – the determination of a location from measuring distances to a few fixed points is known as Trilateration (check out Multilateration while you’re at it).
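The trilateration idea can be sketched directly: subtracting the circle equations pairwise eliminates the quadratic terms and leaves a 2x2 linear system. This is a minimal illustration with exact distances; real systems contend with noisy measurements and typically run least squares over more than three anchors:

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for the 2D position at distances d1, d2, d3 from fixed
    transmitters p1, p2, p3. Subtracting circle equation 1 from
    equations 2 and 3 leaves two linear equations in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("transmitters are collinear; position is ambiguous")
    # Cramer's rule on the 2x2 system
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A tag at (2, 1), measured from transmitters at (0,0), (6,0), (0,6):
x, y = trilaterate((0, 0), math.sqrt(5), (6, 0), math.sqrt(17), (0, 6), math.sqrt(29))
print(round(x, 6), round(y, 6))  # → 2.0 1.0
```

Note the collinearity check: three transmitters on one line cannot pin down a unique point, which is why deployments spread anchors around the coverage area.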


Don’t Let Cyberattacks Take A ‘Byte’ Out Of Your Bottom Line

Should a data breach occur, having an incident response plan in place can help ease the pressure in the heat of the moment. Affected systems should immediately be closed off from the remainder of the company’s infrastructure in order to pinpoint the root cause. When a data breach does occur, use it as a learning experience, extracting as much information as possible about how and why the incident occurred. That information can then be used to strengthen IT infrastructure by plugging holes and establishing improved monitoring programs to detect threats. Reaction plans should be tested and updated regularly to ensure any future threat responses are as effective and efficient as possible.



Why Optimization and WANOP for Your Cloud Is Now Easier than Ever

We’re now pushing down rich content, a variety of applications, and a lot of new use cases. The reality here is that cloud will continue to grow as more users and verticals adopt this very versatile platform. In fact, global spending on IaaS is expected to reach almost $16.5 billion in 2015, an increase of 32.8 percent from 2014, with a compound annual growth rate (CAGR) from 2014 to 2019 forecast at 29.1 percent, according to Gartner. The report goes on to state that over time, as a business becomes more comfortable with the use of IaaS, organizations, especially in the midmarket, will eventually migrate away from running their own data centers in favor of relying primarily on infrastructure in the cloud.


Resiliency Testing Best Practices - Report

Every organization must put a plan in place for recoverability after an outage, but testing your enterprise resilience without full business and IT validation is ineffective. Read the white paper to learn how to put a plan in place for full functional validation, and get details on the importance of validating resiliency in a live environment; learn why small-scale recovery “simulations” are inadequate and misleading; understand why validating resilience demands involvement from IT and the business; and get details on the checks and balances you need to maintain and validate business resilience.



Quote for the day:

"Let a man lose everything else in the world but his enthusiasm and he will come through again to success." -- H. W. Arnold

April 28, 2015

How Data Center Virtualization Shrinks Physical Distance
There’s no doubt that optimization technologies are going to continue to evolve. One of the key technologies making the data center virtualization push is of course software-defined networking. We can do so much more with a physical switch now than we ever could before. We even have network virtualization and the ability to quickly create thousands of vNICs from physical devices. The ability to dynamically create LANs, vLANs, and other types of connectivity points has become easier with more advanced networking appliances. This goes far beyond just optimizing the links between data center environments.


Applying COBIT in a Government Organization
The comprehensive nature of COBIT 5, which combines several areas, including IT risk, information security and governance, is one of its major benefits. In addition, the enablers concept presents a unique view of how and where to pose some questions when adopting and enhancing the framework. To facilitate the transition, the audit function presented to the management staff a simplified model, listing the COBIT 5 processes and asking for the perceived degree of relevance and corporate knowledge of each process. These answers were compared with the maturity observed through audit and internal control actions, making it possible to devise a matrix of priorities for the processes to be analyzed in subsequent audits, which strengthened the support for management decisions through the adoption of the framework.


Smartphone Secrets May Be Better Than a Password
The team used an algorithm to find suitably infrequent events to use as the basis for questions. On average, users succeeded in answering three questions about themselves correctly 95 percent of the time, and they were able to answer questions about other people less than 6 percent of the time. Now, Roy Choudhury says, the researchers are speaking with companies like Yahoo and Intel to figure out if what they’re doing could be useful for enterprise users and, if so, what needs to be done to make the system work well. One issue would be figuring out what kinds of activity data users would be comfortable sharing. Another is how such a system would work if you haven’t used your phone recently or can’t remember who texted you last night at 8:05.


IoT And The Looming Mobile Tidal Wave
"In fact, I consider terming it an 'Internet' a bit of a misnomer, because it largely consists of wireless-connected, non-phone mobile devices interacting in a client-server, or hub-and-spoke, model. The Internet analogy does not, and should not, apply for most real-world applications coming online today." According to Brisbourne, "The level of interconnectivity among devices that's needed for these applications is actually pretty low, as they tend to use dedicated point-to-point communication, and point-to-point service delivery. For example, an irrigation system that responds to physical weather conditions and decides, singularly, when to switch on a sprinkler system. The IoT requires a much simpler mobile architecture as the environment is quite closed, generally capable of flowing a particular type of data in one direction. It is not an extension of the Web into the life of devices."


Clorox CIO discusses the real challenge of big data
When you start from the business use case, Singh adds, infrastructure questions become much easier to answer. "One of the best examples is looking at your volume shipment data and connecting it to certain initiatives you have in the business, like sales," he says. For instance, you may want to measure the effect of a promotion effort. But maybe there was a snowstorm in the region during the period you're evaluating. If the promotion didn't meet expectations, was that due to some quality of the promotion, or was the weather to blame? You need to bring in weather data for that, but you don't need to know what the global weather was in that period, even if you have access to the data.


CIA CIO Doug Wolfe on Commercial Cloud Services (C2S) Lessons Learned and Road Ahead
C2S enables more reliable and functional delivery of services to end users. One of the biggest benefits to date has been delivering those services faster, because developers have common, known and easy-to-work-with environments. In most cases end users will not know C2S is delivering this capability; they just see more and better functionality. One category of functionality, for example, is geospatial applications. Working with both our own and NGA’s technical teams, we are leveraging C2S to deliver enhanced geospatial analysis tools, and end users do not need to be troubled to know where the compute power for those comes from.


EU data protection regulation will drive privacy by design, says KuppingerCole
Kinast believes that privacy by design will have a positive impact on business continuity. Although the regulation tends to be seen in a negative light because businesses foresee they will have to put more effort into designing their software and services, he said that after a while, companies will realise that this approach will lead to better business continuity. “Privacy by design will help companies realise that they need more identity and access management as well as an appropriate security strategy,” said Kinast. Many organisations do not have proper access controls, he said, to ensure that employees can access only the software, systems and data that they need to do their jobs.


Report: Internet of Things to Spur Data Center Demand Explosion
“Equal, or even greater, investments in the IoT platform services residing in the data center will be instrumental in delivering the IoT promise of anytime, anywhere, anyhow connectivity and context,” Rick Villars, vice president of data center and cloud divisions at IDC, said in a statement. “Given the number of devices connected and the amount of data generated, businesses must focus on their IoT service platform requirements at the level of the data center itself, not just the individual servers or storage devices.” The analysts believe IoT will be the single largest driver of IT expansion in larger data centers. Because agility and scale are crucial to IoT applications, that expansion will take place primarily in service-provider data centers rather than on-premise corporate IT facilities.


Captive IT centre boosts competitiveness at Danske Bank in Denmark
“Many companies have started to favour captive centres again because they were cutting too deep into their own knowledge base. They actually outsourced too much of their business knowledge and core knowledge. Now they like to have the essential understanding and competencies internally,” said Henrik Ringgaard, managing consultant at PA Consulting. “They want to make sure they have the right knowledge base and that all business units really have the needed business understanding of their company.” While Ringgaard agreed that captive centres are gaining popularity among very large companies, he said offshoring on the whole is a growing trend and is maturing, particularly in the Nordics.


Kong goes open source: Mashape dubs it the first microservices management layer
"What we're open-sourcing is the back end and the core technology of Mashape. It's a management layer, a centralised dispatcher for microservices and APIs - and it's built on top of nginx, so we're using nginx internally to proxy HTTP APIs," Mashape CEO and co-founder Augusto Marietti said. "On top of that we've built other layers - the infrastructure to manage, monitor, log, secure, authenticate, do transformations - on top of all the APIs." In microservices architectures, applications are built as a suite of small, semi-autonomous processes that communicate with each other through APIs and perform specific tasks. Designed to be easy to use and scalable, microservices are increasingly figuring in web, mobile and internet-of-things apps.



Quote for the day:

“Those who dare to fail miserably can achieve greatly.” -- Robert F. Kennedy

January 13, 2015

Using COBIT 5 to Deliver Information and Data Governance
Part of doing this successfully involves ensuring the availability of reliable and useful information for decision making. This clearly involves keeping the ratio of erroneous or unavailable information to a minimum. Limiting erroneous decision making also involves ensuring that reporting is complete, timely and accurate. Measuring performance here involves looking at the percent of reports that are not delivered on time and the percent of reports containing inaccuracies. These obviously need to be kept to a minimum. Clearly, this function is enabled by backups of systems, applications, data and documentation, taken according to a defined schedule that meets business requirements.


Computers may soon know you better than your spouse
To judge the effectiveness of the computer algorithms, researchers gave questionnaires to friends and relatives of some participants. The survey results and computerized assessments were then compared with the self-assessments from the subjects. With just 10 likes, the computer would know someone as well as a work colleague. With more than 70, it would get to the level of a friend or roommate, and with more than 300 to the level of a spouse or close relative. The study is notable because of its large sample size, said Jennifer Golbeck, a computer scientist at the University of Maryland, College Park, and the director of the University of Maryland Human-Computer Interaction Lab.


New Form of Memory Could Advance Brain-Inspired Computers
Phase-change memory is expected to hit the market in the next few years. It can write information more quickly, and pack it more densely, than the memory used in computers today (see “A Preview of Future Disk Drives”). A phase-change memory chip consists of a grid of “cells” that can each switch between two states to represent a digital bit of information—a 1 or a 0. In IBM’s experimental system, each “synapse” is represented by a pair of memory cells working together. Computer scientists have been working for some time on chips that crudely mimic neurons and synapses. Such “neuromorphic” designs are radically different from the chips we use today.


5 ways to give IT recognition
We all like to know that our efforts are appreciated. For people working in IT, recognition is too often neglected, simply because so much of what IT workers do is behind the scenes and goes unnoticed by the majority of employees. Click through to see five things that Paul Ingevaldson, author of The 9 ½ Secrets of a Great IT Organization, did when he was the CIO at Ace Hardware that cost little to nothing and that you can implement today.


Google Launches Cloud Application Performance Tool
Google Cloud Trace can perform a sort of "replay" analysis of a process stream to identify which users experienced slow request response times and then compose a report that identifies where the time is being spent in the system. Some slowdowns affect only a handful of users but nevertheless produce urgent complaints. Developers often have trouble identifying what's different about the response they obtained from the application versus other users. Cloud Trace is intended to speed the process up. Cloud Trace can break the steps of a single request down into the number of milliseconds that each part takes, pinpointing for developers the likely location of the slowdown.
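The kind of per-step breakdown described here can be sketched generically — an illustrative timing harness around named request steps, not the actual Cloud Trace API:

```python
import time
from contextlib import contextmanager

spans = []  # collected (step name, elapsed milliseconds) pairs

@contextmanager
def span(name):
    """Record the wall-clock milliseconds spent inside a named step."""
    start = time.perf_counter()
    try:
        yield
    finally:
        spans.append((name, (time.perf_counter() - start) * 1000.0))

def handle_request():
    # Hypothetical request pipeline; each step is timed separately,
    # so the slow one stands out in the report.
    with span("auth"):
        time.sleep(0.01)
    with span("db_query"):
        time.sleep(0.05)
    with span("render"):
        time.sleep(0.02)

handle_request()
for name, ms in spans:
    print(f"{name:10s} {ms:7.1f} ms")
```

Here `db_query` would dominate the printed breakdown, pointing the developer at the likely location of the slowdown, which is essentially the analysis a tracing tool automates across many real requests.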


Samsung, SmartThings and the open door to the smart home (Q&A)
In effect, Samsung is readying for the Internet of Things (IoT), the term for the concept of using sensors and other technologies to hook just about anything you can think of into the Internet. Analyst firm Gartner predicts the number of networked devices will surge to 26 billion units by 2020 from about 900 million in 2009, turning formerly "dumb" objects into smart ones that can communicate with each other. IDC reckons the IoT market will hit $3.04 trillion that same year. Samsung acquired smart-home startup SmartThings in August to help with its push. SmartThings' technology helps consumers to control their appliances with their smartphones, smartwatches and other devices, and SmartThings has been viewed as key to Samsung's smart-home and Internet of Things efforts.


DiversityMediocrityIllusion
When interviewing, we make a point of ensuring there are women involved. This gives women candidates someone to relate to, and someone to ask questions which are often difficult to ask men. It's also vital to have women interview men, since we've found that women often spot problematic behaviors that men miss as we just don't have the experiences of subtle discriminations. Getting a diverse group of people inside the company isn't just a matter of recruiting, it also means paying a lot of attention to the environment we have, to try to ensure we don't have the same Alienating Atmosphere that much of the industry exhibits. One argument I've heard against this approach is that if everyone did this, then we would run out of pink, sparkly marbles.


EU countries that set data retention rules must ensure they comply with e-Privacy Directive
In its opinion, the European Parliament's Legal Services unit said EU countries, since the CJEU's judgment, have had the option of either repealing their own laws on data retention or maintaining them. However, it said that should countries choose to maintain the rules then those rules must adhere to the e-Privacy Directive. ... The e-Privacy Directive sets out rules that generally protect the privacy of electronic communications and data associated with those messages, 'traffic data'. One specific provision places a general prohibition on the unauthorised storage of communications and traffic data.


The Future of Scaling and Strategy
One way that scaling strategies work is by distributing products and services through existing platforms. An existing network or platform may be able to replicate a product or service. This is especially helpful for non-profit programs that are already limited in resources but want to reach as many of those who would benefit from the program as possible. A small non-profit may be able to piggyback on an existing network, especially with the availability of cloud computing, to get its message to a wider audience than it could otherwise.


Data Acceleration: Turning Technology Into Solutions
The landscape of solutions that foster data acceleration and enable a successful data supply chain has grown more complex than ever. Executives need to fully understand the technology components available on the market, because each supports data acceleration in unique ways. They also need to recognize that these components deliver maximum value only when they are combined in ways that capitalize on their complementary advantages. Only then can they decide which configurations may be best for their organization’s needs and discuss prospective solutions with vendors – and ultimately achieve returns from their analytics and big data investment.



Quote for the day:

"A good general not only sees the way to victory; he also knows when victory is impossible." --Polybius

December 04, 2014

Juniper Unbundles Switch Hardware, Software
Combining Junos with OCP hardware removes the burden of support, installation and maintenance from cloud providers and places it squarely on the vendor – in this case, Juniper. Juniper has not yet announced pricing for the OCX1100 hardware, but customers buying in large volumes will “be pleased,” says Jonathan Davidson, senior vice president and general manager for Juniper’s Security, Switching and Solutions Business Unit. Smaller-volume purchases will be priced comparably to Juniper’s internally designed top-of-rack switches, Davidson says.


Colo Business Thrives as Enterprises Move to Cloud
“The relative spend on (and prospects for) colocation, enterprise data centers, and cloud are all intertwined,” said Dinsdale. “Clearly enterprises are pushing more and more IT workloads onto the cloud, which diminishes their potential spend on their own data centers. Colocation is in an interesting middle ground. The growth of cloud is a big driver for colocation growth while trends in the enterprise are inhibiting growth in enterprise spend on colocation.” Most of the spend on retail colocation doesn’t come directly from enterprises, but from various types of service providers such as cloud, IT, telcos, and content providers.


The Power of Transformational Feedback – Entering the ZOUD
It might sound like something out of science fiction, but the ZOUD – the “zone of uncomfortable debate” – is a pithy phrase first coined by Professor Cliff Bowman as part of his research at Cranfield School of Management into the nature of high-performing teams. It describes the area of creative tension that exists in any conversation that is more than a social chat, and which must be penetrated if we are to deliver the message we need to get across. For most of us, entering the ZOUD does not come naturally, since we have learnt the skills of comfortable debate and have learnt to prize rapport highly in our everyday relationships.


6 Things Slowing Down Big Data
Organizations also gather data in bulk from other sources, namely sensor networks, remote sensing via satellites, vehicle diagnostic data and point-of-sale terminals. This trend of automated data collection has the potential to drive a radical transformation in how enterprises research, innovate, market and ultimately grow. While one would think this glut of information collected by machines would be a boon for enterprise users of Big Data, it has become apparent that the trend is a victim of its own success. While Big Data can be very useful to an organization, there are six issues that currently hinder the progress of the field.


COBIT 5 Advantages for Small Enterprises
The process of implementing this principle—and the other core COBIT principles—can be managed as simply or with as much detail as the enterprise deems appropriate. It is sensible to ensure that COBIT is properly consumed and understood, of course, but even taking a basic approach is likely to provide the organisation with tangible benefits when properly considered. COBIT 5 Implementation provides a good high-level overview of the principles and how they relate to the life cycle. It also provides a more granular description of how these principles can be applied in practice.


Operational Intelligence: The Next-Generation of Business Intelligence
With the emergence of the Internet of Things and the demand for greater customer personalization, companies are increasingly striving to quickly make sense of their data as it changes. Operational intelligence – the ability to analyze live, fast-changing data and provide immediate feedback – takes business intelligence to the next level and creates amazing new opportunities. Using in-memory computing technology allows live, fast-changing data to be stored, updated and analyzed continuously. Ever-changing data streams, enriched with historical data and then analyzed in parallel, provide powerful feedback on the fly. The benefits of operational intelligence are far-reaching and applicable to a wide range of industries, including manufacturing, cable, and retail.
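The enrichment pattern described above – joining a live event stream with historical data held in memory to produce immediate feedback – can be sketched in a few lines. This is an illustrative toy, not any vendor's product; the store names, fields, and the anomaly rule are all assumptions.

```python
# Minimal sketch of operational intelligence: each live event is
# enriched with in-memory historical data (a per-device baseline here)
# and scored on the fly. Field names and the alert rule are hypothetical.

# Historical reference data held in memory, e.g. per-device baselines.
baselines = {"dev-1": 10.0, "dev-2": 25.0}

def enrich_and_score(event):
    """Join one live event with its historical baseline; flag large deviations."""
    baseline = baselines.get(event["device"], 0.0)
    deviation = event["value"] - baseline
    # Alert when the reading strays more than 50% from its baseline
    # (or when no baseline exists at all).
    return {**event, "baseline": baseline,
            "alert": abs(deviation) > 0.5 * baseline if baseline else True}

# Simulated live stream of fast-changing events.
stream = [
    {"device": "dev-1", "value": 10.4},   # close to baseline
    {"device": "dev-2", "value": 40.0},   # far from baseline
]

results = [enrich_and_score(e) for e in stream]
```

In a production system the same join would run continuously and in parallel inside an in-memory data grid rather than over a Python list, but the shape of the computation is the same.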


How Global Enterprises are Grappling with New Data Protection Demands
Yes, disruptions in protection will continue to limit product and service development. Yes, downtime will continue to take a bite out of revenue. And yes, incremental business opportunities, customer acquisition and repeat business will continue to be affected by the way we protect our data. But the bigger issue – the one that global enterprises of all sizes will really want to pay attention to – is how data protection will affect new business opportunities and revenue streams going forward. That’s why we’ll likely see these types of business consequences, along with a loss in market value, move to the top of the disruption list.


Around the World With BYOD
BYOD is stalling in Europe, too, according to an IDC Europe report earlier this year. One reason is that employees simply expect the company to provide a mobile device for work. "There's a cultural expectation here that your employer will provide you with the tools you need to do your job," says John Delaney, associate vice president of mobility at IDC. "You don't expect to have to buy it yourself." In Brazil, workplace regulations require corporations to provide all required technology to employees, according to the Dell study. In terms of IT maturity, many Brazilian companies lack the infrastructure and security requirements to easily integrate BYOD. As a result, BYOD hasn't taken off there.


Microsoft's microservices vision for Azure starts taking shape
Vanhoutte also said that the new BizTalk Micro Services platform will be available through Microsoft's Azure Pack, which will allow customers to run the service "in the cloud of their choice." According to another attendee, @phidiax, a platform preview of Azure BizTalk Microservices is due in the first quarter of 2015. "BizTalk Micro Services will all run in their own scalable container (similar to Azure web sites) and that the communication engine seems to be following the lightweight HTTP approach," Vanhoutte blogged.
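The "lightweight HTTP approach" the excerpt mentions – each microservice running in its own container and exposing a small HTTP endpoint – can be illustrated with the Python standard library. This is a generic sketch of the style, not BizTalk Microservices itself; the endpoint path and payload are assumptions.

```python
# Minimal sketch of a lightweight-HTTP microservice: one small service,
# one self-describing endpoint, suitable for running in its own container.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expose a single JSON endpoint, as a microservice typically would.
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def start_service(port=0):
    """Start the service on an ephemeral port; return (server, port)."""
    server = HTTPServer(("127.0.0.1", port), HealthHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

A peer service (or a `curl http://127.0.0.1:<port>/health`) talks to it over plain HTTP, which is what keeps the communication engine lightweight: no shared runtime, just small services exchanging JSON.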


NexGen Cloud: Pressure To Adopt Software-defined Data Center Tech Mounting
"There are tremendous pressures causing data center administrators to consolidate," he said. "Now they're talking about looking at new ways to do it." Some industries, such as networking and storage, remain stuck in the past, unlike server virtualization, which has exploded since 2005, offering customers flexibility, efficiency and the ability to decouple apps from the server hardware, Elliot said. "Now we're seeing similar trend lines across networking and storage ... When you think about networking and storage industries, these businesses are ripe for disruption," he said.



Quote for the day:

"An army of principles can penetrate where an army of soldiers cannot." -- Thomas Paine