Daily Tech Digest - May 07, 2024

How generative AI is redefining data analytics 

When applied to analytics, generative AI:

- Streamlines the foundational data stages of ELT: predictive algorithms are applied to optimize data extraction, intelligently organize data during loading, and transform data with automated schema recognition and normalization techniques.
- Accelerates data preparation through enrichment and data quality: AI algorithms predict and fill in missing values and identify and integrate external data sources to enrich the data set, while advanced pattern recognition and anomaly detection ensure data accuracy and consistency.
- Enhances analysis of data through techniques such as geospatial analytics and autoML: mapping and spatial analysis through AI-generated models enable accurate interpretation of geographical data, while automated selection, tuning, and validation of machine learning models increase the efficiency and accuracy of predictive analytics.
- Elevates the final stage of analytics, reporting: custom, generative AI-powered applications provide interactive data visualizations and analytics tailored to specific business needs.
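
To ground the data-preparation point, here is a minimal C# sketch of missing-value imputation. Plain mean imputation is a deliberately simple stand-in for the learned predictive models the article alludes to, and the revenue column is an invented example:

```csharp
using System;
using System.Linq;

class ImputationSketch
{
    // Fill missing readings (modeled as nulls) with a predicted value.
    static double?[] Impute(double?[] column)
    {
        // Average of the known values; a real pipeline would predict per row.
        double prediction = column.Where(v => v.HasValue)
                                  .Select(v => v.Value)
                                  .Average();
        return column.Select(v => v ?? prediction).ToArray();
    }

    static void Main()
    {
        double?[] revenue = { 120.0, null, 95.5, 110.0, null };
        Console.WriteLine(string.Join(", ", Impute(revenue)));
        // Output: 120, 108.5, 95.5, 110, 108.5
    }
}
```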


Open-source or closed-source AI? Data quality and adaptability matter more

Licensing and usage terms of service matter in that they dictate how you use a particular model — and even what you use it for. Even so, getting caught up in the closed vs. open zealotry is shortsighted at a time when 70% of CEOs surveyed expect gen AI to significantly alter the way their companies create, deliver and capture value over the next three years, according to PwC. Rather, you should focus on the quality of your data. After all, data will be your competitive differentiator — not the model. ... Experimenting with different model types and sizes to suit your use cases is a critical part of the trial-and-error process. Right-sizing, or deploying the most appropriate model sizes for your business, is even more crucial. Do you require a broad, boil-the-ocean approach that spans as much data as possible to build a digital assistant with encyclopedic knowledge? A large LLM trained on hundreds of billions of data points may work well. ... Of course, the gen AI model landscape is ever evolving. Future models will look and function differently than those of today. Regardless of your choices, with the right partner you can turn your data ocean into a wellspring of insights.


Tips for Building a Platform Engineering Discipline That Lasts

A great platform engineer is defined both by their ability to create infrastructure and by their ability to advocate for and guide others (which is where communication skills come in) — especially in the platforms that are maturing today. As far as hard skills go, the platform engineer should have experience in cloud platforms, CI/CD, IaC, security, and automation. Other roles you’ll need include a product owner to manage platform stakeholders and track KPIs. Our 2024 State of DevOps report found that 70% of respondents said a product manager was important to the platform team – 52% of whom called the role “critical”. To avoid complexity and scaling issues, you’ll also need architects with the vision and skills to help the platform engineering team design and build the platform. Infrastructure as code (IaC) is version control for your infrastructure. It makes infrastructure human-readable, auditable, repeatable, scalable, and securable. IaC also lets disparate teams — developers, operations, and QA — review, collaborate, iterate, and maintain infrastructure code simultaneously.
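
For illustration, here is what infrastructure as code can look like in practice: a minimal sketch using Pulumi's C# SDK with its AWS provider (the tool choice and the single bucket are assumptions for the example, not something the report prescribes). Because the definition is plain code in a repository, it can be diffed, reviewed, and rolled back like any other change:

```csharp
using System.Collections.Generic;
using Pulumi;
using Aws = Pulumi.Aws;

return await Deployment.RunAsync(() =>
{
    // Declare the desired state; the IaC tool reconciles the cloud to match it.
    var bucket = new Aws.S3.Bucket("app-logs", new Aws.S3.BucketArgs
    {
        Acl = "private",
    });

    // Export the resulting resource name for other teams and pipelines.
    return new Dictionary<string, object?>
    {
        ["bucketName"] = bucket.Id,
    };
});
```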


What Is the American Privacy Rights Act, and Who Supports It?

The APRA ostensibly is about data, but AI is also covered a bit. Companies must evaluate their “covered algorithms” before deploying them and provide that evaluation to the FTC and the public. Companies must also adhere to people’s requests to opt out of the use of any algorithm related to housing, employment, education, health care, insurance, credit, or access to places of public accommodation. The APRA would be enforced by a new bureau operating under the Federal Trade Commission (FTC). State attorneys general would also be able to enforce the new law. It would also allow individuals to file private lawsuits against companies that violate the law. There are several important exceptions in the APRA. For instance, small businesses, defined as having less than $40 million in annual revenue or collecting data on 200,000 or fewer individuals (as long as they’re not in the data-selling business themselves), are exempt from the APRA’s requirements. Governmental agencies and organizations working for them are also exempt, as are non-profit organizations whose main purpose is fighting fraud.


Empowering Users: Embracing Product-Centric Strategies in SaaS

A non-negotiable requirement for a SaaS product to succeed with a product-centric strategy is that it be intuitively designed, with minimal friction and a focus on delivering value as quickly as possible. This is not a set-and-forget task: it demands a profound understanding of the critical user journey and ruthless prioritization of friction and pain point elimination instead of just plastering feature promotions through in- and out-of-product interventions. However, this cannot be done if teams don’t use data analytics or prioritize the voice of the customer through feedback loops to further product development and work towards building a loved and delightful product. A great example of a PLG pioneer is Figma. ... On the other hand, adopting a product-led growth approach requires fundamental organizational shifts. The success of PLG requires a combined, multidisciplinary team dedicated to continuous improvement and adaptation of the product to support new customer acquisition as well as retention and growth.


6 tips to implement security gamification effectively

Gamification leverages elements of traditional gaming, online and offline, to boost engagement and investment in the learning process. Points, badges, and leaderboards reward successful actions, fostering a sense of achievement and friendly competition. Engaging scenarios and challenges simulate real-world threats, allowing trainees to apply knowledge practically. Difficulty levels keep learners engaged, while immediate feedback on decisions solidifies learning and highlights areas for improvement. Effective implementation hinges on transparency, simplicity, and a level playing field. A central dashboard that displays the same security data for everyone keeps things simple, fostering a shared understanding of progress. ... Personalized challenges help ensure engagement. New security teams might focus on mastering foundational tasks like vulnerability scans, while seasoned teams tackle advanced challenges like reducing response time for critical security events. This keeps everyone motivated and learning, while offering continuous improvement for the entire team.


Rethinking ‘Big Data’ — and the rift between business and data ops

Just as data scientists need to think more like businesspeople, so too must businesspeople think more like data scientists. This goes to the issue of occupational identity. Executives need to expand their professional identities to include data. Data professionals need to recognize that ΔI (changes in information) do not necessarily equate to ΔB (changes in behavior). Going forward, data professionals are not just in the information/insight delivery business; they are in the “create insight that drives value-creating behavior” business. The portfolio of tools available now has democratized the practice of data science. One no longer needs to be a math genius or coding phenom to extract value from data — see Becoming a Data Head: How to Think, Speak, and Understand Data Science, Statistics, and Machine Learning by Alex J. Gutman and Jordan Goldmeier. ... Executives need ready access to data professionals to guide their use of data power tools. Data professionals need to be embedded in the business rather than quarantined in specialized data gulags.


The Technical Product Owner

There is a risk that the technical Product Owner or product manager might no longer focus on the “why” but start interfering with the “how,” which is the Developers’ domain. On the other hand, a technical Product Owner might help the Developers understand the long-term business implications of technical decisions made today. ... A technical Product Owner would be highly beneficial when the product involves complex technical requirements or relies heavily on specific technologies. For example, in projects involving intricate software architecture or specialized domain knowledge, a technical Product Owner can provide valuable guidance, facilitate more informed decision-making, and communicate effectively with the Developers. This deep technical understanding can lead to better solutions, improved product quality, and increased customer satisfaction, especially in industries where technical expertise is critical, such as software development or engineering.


The digital transformation divide in Europe’s banking industry

Europe’s digital divide is a product of familiar factors: internet connectivity, digital literacy, and the availability of smartphones and digital devices. Disparities in broadband access between urban and rural communities remain stubbornly persistent. According to Eurostat, around 21% of rural households in the European Union do not have access to broadband internet, compared to only 2% of urban households. In Romania, which ranked lowest on the EU’s Digital Economy and Society Index in 2022, the market is dominated by incumbent banks. Only 69.1% of adults hold a bank account, pointing to low levels of financial literacy and inclusion – underpinned by a preference for a cash economy. In contrast, the UK has a fintech adoption rate of over 60% according to data from Tipalti, and Lithuania has established itself as an impressive fintech ecosystem backed by the nation’s central bank. However, it is too simplistic to reduce the digital divide to regional disparities within countries, as the starker differences lie between the countries themselves.


Why AMTD Is the Key to Stopping Zero-Day Attacks

AMTD technology uses polymorphism to create a randomized, dynamic runtime memory environment. Deployable on endpoints and servers, this polymorphic ability creates a prevention-focused solution that constantly moves system resources while leaving decoy traps in their place. Threats then see decoy resources where real ones should be and end up trapped. For users, it’s business as usual because they don't notice any difference: system performance is unaffected, while security teams gain a new layer of preventative telemetry. Today, more and more companies are turning to AMTD technologies to defeat zero-days. In fact, industry analysts like Gartner suggest that AMTD technology is paving the way for a new era of cyber defense possibilities. That’s because instead of trying to detect zero-day compromise, these technologies prevent exploits from deploying in the first place. Against zero-day attacks, this is the only defensive approach organizations can rely on.



Quote for the day:

"Always remember, your focus determines your reality." -- George Lucas

Daily Tech Digest - February 03, 2024

NCA’s Plaggemier on Finding a Path to Data Privacy Compliance

On the international stage, companies are becoming more aware of the increasingly active and robust policies they may face and the penalties those policies can carry. That has led to some patterns developing, Plaggemier says, around what is reasonable for companies to enact in relation to their sector and industry. “Do you have security or privacy tools or practices in place that are in line with your competitors?” she asks. While such an approach might be considered reasonable at first, competitors might be way ahead with much more mature programs, Plaggemier says, possibly making copying rivals no longer a reasonable approach and compelling companies to find other ways to achieve compliance. Data privacy regulations continue to gain momentum, and she believes it will be interesting to see what further enforcement actions develop and how the courts in California, for example, handle them. As CCPA and other state-level regulations continue into their sophomore eras, Plaggemier says at least a few more states seem likely to get on the bandwagon of data privacy regulation. Meanwhile, there is also some growing concern about how AI may play a role in potential abuses of data in the future.


What Is Enterprise Architecture? (And Why Should You Care About It)

Ideally, Enterprise Architecture supplies the context and insight to guide Solution Architecture. To address broad considerations and align diverse stakeholder viewpoints, Enterprise Architecture often needs to be broader, less specific, and less technical than Solution Architecture. ... Done well, Enterprise Architecture should provide long-term guidance on how different technology components support overall business objectives. It should not prescribe how technology is, or should be, implemented, but rather provide guardrails that help inform design decisions and prioritization. Additionally, most organizations have several technology components that support business operations; Salesforce is usually just one. Understanding how the various technology components work together will enable you to be a well-informed, contributing member of a larger team. EA can help to provide valuable context about how Salesforce interacts with other systems and might spark ideas on how Salesforce specifically can be better utilized to support an organization.


AnyDesk says hackers breached its production servers, reset passwords

In a statement shared with BleepingComputer late Friday afternoon, AnyDesk says they first learned of the attack after detecting indications of an incident on their product servers. After conducting a security audit, they determined their systems were compromised and activated a response plan with the help of cybersecurity firm CrowdStrike. AnyDesk did not share details on whether data was stolen during the attack. However, BleepingComputer has learned that the threat actors stole source code and code signing certificates. The company also confirmed that the attack did not involve ransomware but didn't share much information about the attack beyond saying their servers were breached, with the advisory mainly focusing on how they responded to the attack. As part of their response, AnyDesk says they have revoked security-related certificates and remediated or replaced systems as necessary. They also reassured customers that AnyDesk was safe to use and that there was no evidence of end-user devices being affected by the incident. "We can confirm that the situation is under control and it is safe to use AnyDesk."


The Ultimate 7-Step CEO Guide to Visionary Leadership

Unlike strategic objectives, which are rationally derived, visions are values-laden. They give meaning through an ideological goal. Since they are about what should be, they are, by definition, an expression of values and corporate identity. Thus, effective CEOs keep the vision malleable in relation to the business landscape but never change the values underneath. Not only that, but their personal values align with the organization and its vision — one reason for doing a values assessment in CEO succession. ... Some of the most catastrophic events in history have been the result of a psychopath's vision. Visions can be powerful, influential and morally corrupt — all at the same time. Conversely, real leaders create a vision that benefits the entire ecosystem, where the rising tide lifts all boats and makes the world a better place. Robert House, from the University of Pennsylvania, defined a greater good vision as "an unconscious motive to use social influence, or to satisfy the power need, in socially desirable ways, for the betterment of the collective rather than for personal self-interest." This is using the will to power for the betterment of humanity, to shape the future, rather than as a source of ruthless evildoing.


AI Revolutionizes Voice Interaction: The Dawn Of A New Era In Technology

So what can we do to make sure we’re ready for this universal shift to voice-controlled tech and having natural language conversations with machines? Dengel suggests the answer lies in meeting the challenge head-on. This means drawing together teams made up of technologists, engineers, designers, communications experts and business leaders. Their core focus is to identify opportunities and potential risks to the business, allowing them to be managed proactively rather than reactively. “That’s always the first step,” he says, “because you start defining what’s possible, but you’re doing it in the context of what’s realistic as well because you’ve got your tech folks involved as well … ” It’s a “workshop” approach pioneered by Apple and adopted by various tech giants that have found themselves at the forefront of an emerging wave of transformation. But it’s equally applicable to just about any forward-looking business or organization that doesn’t want to be caught off-guard. Dengel says that when addressing a group of interns recently, he told them, “I wish I were in your shoes – the next five years is gonna be more innovation than there’s been in the last five or maybe the last 20 years.”


Level up: Gamify Your Software Security

Gamification has been a great way to increase skills across the industry, and this has become particularly important as adversaries become more sophisticated and robust security becomes a critical piece of business continuity. ... We all love our extrinsic motivators, whether it’s stars or our green squares of activity on GitHub or even our badges and stickers in forums and groups. So why not create a reward system for security too? This makes it possible for developers to earn points, badges or status for successfully integrating security measures into their code, recognizing their achievements. ... Just as support engineers are often rewarded for the speed and volume of tickets they close, similar ideas can be used to advance security practices and hygiene in your organization. Use leaderboards to encourage a healthy competitive spirit and recognize individuals or teams for exceptional security contributions. ... This is in addition to the badges and other rewards mentioned above. I’ve seen recognition programs for other strategic initiatives in organizations, such as “Top Blogger” or “Top Speaker” and even special hoodies or swag awarded to those who achieve the title, giving it exclusivity and prestige.
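
To make the points-and-badges idea concrete, here is a minimal sketch of such a reward ledger in C#. The action names, point values, and badge thresholds are all illustrative assumptions, not a scheme the article prescribes:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A minimal points-and-badges ledger for security contributions.
class SecurityLeaderboard
{
    private readonly Dictionary<string, int> _points = new();

    public void Award(string developer, string action) =>
        _points[developer] = _points.GetValueOrDefault(developer) + action switch
        {
            "fixed-vulnerability" => 50,
            "added-security-test" => 20,
            "completed-training"  => 10,
            _                     => 5,
        };

    public string BadgeFor(string developer) => _points.GetValueOrDefault(developer) switch
    {
        >= 100 => "Security Champion",
        >= 50  => "Defender",
        _      => "Apprentice",
    };

    // Leaderboard rows, highest score first.
    public IEnumerable<string> Top(int n) =>
        _points.OrderByDescending(kv => kv.Value)
               .Take(n)
               .Select(kv => $"{kv.Key}: {kv.Value} pts ({BadgeFor(kv.Key)})");
}

class Demo
{
    static void Main()
    {
        var board = new SecurityLeaderboard();
        board.Award("amara", "fixed-vulnerability");
        board.Award("amara", "fixed-vulnerability");
        board.Award("li", "added-security-test");
        foreach (var row in board.Top(3)) Console.WriteLine(row);
    }
}
```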


802.11x: Wi-Fi standards and speeds explained

The big news in wireless is the expected ratification of Wi-Fi 7 (802.11be) by the IEEE standards body early this year. Some vendors are already shipping pre-standard Wi-Fi 7 gear, and the Wi-Fi Alliance announced in January that it has begun certifying Wi-Fi 7 products. While the adoption of Wi-Fi 7 is expected to have the most impact on the wireless market, the IEEE has been busy working on other wireless standards as well. In 2023 alone, the group published 802.11bb, a standard for communication via light waves; 802.11az, which significantly improves location accuracy; and 802.11bd for vehicle-to-vehicle wireless communication. Looking ahead, IEEE working groups are tackling new technology areas, such as enhanced data privacy (802.11bi), WLAN sensing (802.11bf), and randomized and changing MAC addresses (802.11bh). In addition, the IEEE has established special-interest groups to investigate the use of ambient energy harvested from the environment, such as heat, to power IoT devices. There’s a study group looking at standards for high-throughput, low-latency applications such as augmented reality/virtual reality. Another group is developing new algorithms to support AI/ML applications.


What is AI networking? Use cases, benefits and challenges

AI networking can optimize IT service management (ITSM) by handling the most basic level 1 and level 2 support issues (like password resets or hardware glitches). Leveraging NLP, chatbots and virtual agents can field the most common and simple service desk inquiries and help users troubleshoot. AI can also identify higher-level issues that go beyond step-by-step instructions and pass them along for human support. AI networking can also help reduce trouble-ticket false positives by approving or rejecting tickets before they are acted on by the IT help desk. This can reduce the probability that human workers will chase tickets that either weren’t real problems in the first place, were mistakenly submitted, were duplicates, or were already resolved. ... AI can analyze large amounts of network data and traffic and perform predictive network maintenance. Algorithms can identify patterns, anomalies and trends to anticipate potential issues before they degrade performance or cause unexpected network outages. IT teams can then act on these to prevent — or at least minimize — disruption. AI networking systems can also identify bottlenecks, latency issues and congestion areas.
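
As a toy illustration of the anomaly-detection idea, here is a minimal C# sketch that flags latency samples far from the mean. Real AI networking products use far richer models; the three-sigma rule and the sample values are assumptions for the example:

```csharp
using System;
using System.Linq;

class LatencyAnomalyDetector
{
    // Flag samples more than `threshold` standard deviations from the mean:
    // a deliberately crude stand-in for the models AI networking uses.
    static bool[] FlagAnomalies(double[] latenciesMs, double threshold = 3.0)
    {
        double mean = latenciesMs.Average();
        double std = Math.Sqrt(latenciesMs.Average(x => (x - mean) * (x - mean)));
        return latenciesMs.Select(x => Math.Abs(x - mean) > threshold * std).ToArray();
    }

    static void Main()
    {
        double[] samples = { 12.1, 11.8, 12.4, 12.0, 12.2, 11.9,
                             12.3, 12.1, 11.7, 12.5, 12.0, 95.0 };
        bool[] flags = FlagAnomalies(samples);
        for (int i = 0; i < samples.Length; i++)
            if (flags[i]) Console.WriteLine($"Anomalous latency: {samples[i]} ms");
        // Prints: Anomalous latency: 95 ms
    }
}
```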


Low-Power Wi-Fi Extends Signals Up to 3 Kilometers

Morse Micro has developed a system-on-chip (SoC) design that uses a wireless protocol called Wi-Fi HaLow, based on the IEEE 802.11ah standard. The protocol significantly boosts range by using lower-frequency radio signals that propagate further than conventional Wi-Fi frequencies. It is also low power, and is geared toward providing connectivity for Internet of Things (IoT) applications. To demonstrate the technology’s potential, Morse Micro recently conducted a test on the seafront in San Francisco’s Ocean Beach neighborhood. They showed that two tablets connected over a HaLow network could communicate at distances of up to 3 km while maintaining speeds around 1 megabit per second—enough to support a slightly grainy video call. ... “It is pretty unprecedented range,” says Prakash Guda, vice president of marketing and product management at Morse Micro. “And it’s not just the ability to send pings but actual megabits of data.” The HaLow protocol works in much the same way as conventional Wi-Fi, says Guda, apart from the fact that it operates in the 900-megahertz frequency band rather than the 2.4-gigahertz band. 


How to Make the Most of In-House Software Development

Maintaining an in-house software development team can be tough. You must hire skilled developers – which is no easy feat in today’s economy, where talented programmers remain in short supply – and then manage them on an ongoing basis. You must also ensure that your development team is nimble enough to respond to changing business needs and that it can adapt as your technology stack evolves. Given these challenges, it’s no surprise that most organizations now outsource application development instead of relying on in-house teams. But I’m here to tell you that just because in-house development can be hard doesn’t mean that outsourcing is always the best approach. On the contrary, IT organizations that choose to invest in in-house development for some or all of the work can realize lower overall costs and a competitive advantage by creating domain-specific expertise. Keeping development in-house can help organizations address unique security requirements and maintain full control over the development lifecycle and roadmaps. For businesses with specialized technology, security and operational needs, in-house development is often the best strategy.



Quote for the day:

“The first step toward success is taken when you refuse to be a captive of the environment in which you first find yourself.” -- Mark Caine

Daily Tech Digest - January 02, 2024

Decoding the Black Box of AI – Scientists Uncover Unexpected Results

“If the GNNs do what they are expected to, they need to learn the interactions between the compound and target protein and the predictions should be determined by prioritizing specific interactions,” explains Prof. Bajorath. According to the research team’s analyses, however, the six GNNs essentially failed to do so. Most GNNs only learned a few protein-drug interactions and mainly focused on the ligands. Bajorath: “To predict the binding strength of a molecule to a target protein, the models mainly ‘remembered’ chemically similar molecules that they encountered during training and their binding data, regardless of the target protein. These learned chemical similarities then essentially determined the predictions.” According to the scientists, this is largely reminiscent of the “Clever Hans effect”. This effect refers to a horse that could apparently count. How often Hans tapped his hoof was supposed to indicate the result of a calculation. As it turned out later, however, the horse was not able to calculate at all, but deduced expected results from nuances in the facial expressions and gestures of his companion.


Why 2024 is the year for IT managers to revamp their green IT plans

Conversations with enterprise operators reveal that many do not consolidate applications when they refresh their IT equipment and are hesitant to deploy power-aware workload management tools out of concern for impacting the reliability of their IT operations. IT managers must guide their organisations to intelligently utilise their available equipment capacity, using software tools to measure, manage, and maximise utilisation within reliability constraints. All organisations should set equipment utilisation goals and build multi-year efficiency project plans to improve IT infrastructure energy performance. Data available from IT equipment manufacturers indicate that workloads on two to eight old servers can be migrated to one new server when deploying n+1 (e.g., Intel or AMD CPU generation 3 to 4) or n+2 (e.g., Intel or AMD CPU generation 2 to 4) technology. Migrating the workloads of four old servers onto one new one, for instance, shrinks that slice of the fleet by three-quarters. Similar improvements can be achieved in storage and network equipment. Consolidating CPU workload and storage and network operations delivers a defined workload on one-half to three-quarters of the equipment.


AI Everywhere, All the Time: Top Developments of 2023

AI-powered robots are increasingly automating tasks in manufacturing and logistics, among other industries, driving efficiency and changing the nature of work. Some major milestones include Tesla's Optimus Bot prototype demonstrating dexterous and adaptable humanoid robots, which could shape future automation solutions. Separately, Boston Dynamics' Atlas showcased its parkour skills, paving the way for applications in search and rescue or disaster response. The AlphaFold 2 AI system, developed by Alphabet subsidiary DeepMind, can predict protein structures and stands to revolutionize drug discovery and personalized medicine, with the potential to help mitigate numerous diseases. Robotic surgery systems grow ever more sophisticated, while AI-powered prosthetics offer amputees greater control and functionality. AI algorithms are already assisting doctors in medical diagnosis for diseases such as cancer, offering increased accuracy and early detection possibilities.


How Gamification Can Help Your Business

At work, gamification is often used to build employee experience by promoting fun competition and immersive learning experiences, leading to better information retention and a heightened incentive to engage in ongoing learning and upskilling, Ringman says. Gamification is frequently used to boost staff productivity. “In any business, there are many things that need to be done every day that many of us aren’t naturally motivated to do,” Avila observes. Gamification provides helpful context, guidance, and rewards, allowing tasks to be completed faster and more efficiently while improving focus. “This, in turn, helps the company achieve larger business goals.” Brands can also tap into gamification as they strive to engage customers and transform ordinary interactions into memorable experiences. Ringman notes that brands can use gamification to add extra fun to loyalty programs by hosting contests and competitions, as well as awarding virtual badges and trophies to customers as they complete various actions or pass significant milestones.


Envisioning a great future – India as a SuperPower

A nation’s growth is underpinned by technological advancement and how swiftly it adopts tech. During the recent state visit of India’s Prime Minister, Narendra Modiji, to the United States, he put a lot of emphasis on growing technology that will revolutionize various industries. India is fast moving towards digitization. The thrust from the Government of India with the Digital India initiative and the growing use of digital technology such as Artificial Intelligence, Machine Learning and Data Analytics across various private organisations is bringing a phenomenal shift in India’s growth and development. Secondly, there is going to be a lot of disruption in the way we work. With AI, lots of work will be done by bots, so it is important to have highly skilled labor to manage the AI, which will also require upskilling the workforce, as we will have more leisure. The way society works will change, and we will need to be adaptable. AI tools can be used as add-on tools to enable our lawyers, CAs, economists and leaders at large. Today, India is a force to be reckoned with in the domain of Information Technology without an iota of doubt.


Wi-Fi 7’s mission-critical role in enterprise, industrial networking

Wi-Fi 7 devices can use multi-link operation (MLO) in the 2.4 GHz, 5 GHz, and 6 GHz bands to increase throughput by aggregating multiple links or to quickly move critical applications to the optimal band using seamless switching between links. Fast link switching allows Wi-Fi 7 devices to avoid interference and access Wi-Fi channels without delaying critical traffic. This and other new features also make Wi-Fi 7 ideal for immersive XR/AR/VR, online gaming and other consumer applications that require high throughput, low latency, minimal jitter, and high reliability. ... Naturally there are challenges with achieving seamless connectivity between 5G & Wi-Fi. A lot of industry alignment is needed to enable frictionless movement between networks, across technologies, vendors, and areas such as authentication, QoS, QoE and security. The Wireless Broadband Alliance is playing a key role in bringing all the stakeholders (operators, enterprises and network owners) together to ensure collaboration and alignment on the frameworks that will deliver seamless connectivity.


Data Center Governance Trends to Watch in 2024

Historically, data center governance did not drive frequent conversations in the data center industry. Data center operators sometimes talked about it, but it has not tended to be a core area of concern – perhaps because, unlike other types of governance, data center governance isn't a requirement for businesses seeking to meet regulatory rules or avoid compliance fines. Looking ahead, however, governance in data centers is likely to become a more common item of discussion. Data centers have now matured to the point that businesses are increasingly keen to squeeze as much efficiency as possible out of them. In the past, disorganized data center assets or a lack of optimal server room layouts may not have been critical. But today, data center operators face growing pressure to maximize the efficiency of their facilities. Certain regulators are now requiring disclosures about data center emissions, for example, meaning that increasing energy efficiency through effective governance practices has become important for protecting businesses' brands and reputations.


Essential skills for today’s threat analysts

Very often, for instance, there's an urgent need to communicate a new vulnerability to different audiences, which demands tailored communications for technical teams, CISOs, and board members. Williams highlights task management and patience, especially when dealing with uncertain or misleading information, and above all, coordinating between different sources of information. "So much of threat hunting today relates to that living off the land kind of thing where you're seeing things that look malicious. And so oftentimes you’re developing hypotheses, and that involves consulting system admins and working toward a resolution," says Williams. ... It's also a mind game, with threat hunters needing to be highly adaptable as threats are changing daily, sometimes hourly. "You need to change with them. Never allow an inflexible mind to pervade your operational approach," says Brian Hussey, VP of threat hunting, intelligence and DFIR at SentinelOne. At the same time, you also need to see the forest for the trees. "Often threat actors introduce surface changes to their attack patterns, but core modus operandi remains unchanged, leaving important opportunities to identify and eliminate new attacks, even before they arrive," Hussey tells CSO.


Want to tackle technical debt? Sell it as business risk

There is no magic potion that can eliminate all technical debt, but it can be attacked via budgeting if it is not perceived merely as upgrading IT infrastructure. What CIOs need to do instead is to present IT infrastructure investment as an important corporate financial and risk management issue that the business can’t afford to ignore. ... Technical budget justifications for IT infrastructure upgrades, which are seldom linked to end business strategies, make it easy for budget decision-makers to defer IT infrastructure investment. Instead, budget decision-makers figure that the company can “make do” because IT will somehow find a way to keep systems running. CIOs must change this thinking. They can start the process by changing IT infrastructure investment justifications from technical explanations to corporate financial and risk management explanations. ... CIOs should also team with the CFO to help reframe the tech debt narrative, because CFOs are always on the lookout for new corporate financial and risk management scenarios.


Leveraging Leadership: The Fourfold Path to Business Control

Belief systems function as a mechanism for communicating the core values, objectives, and mission of the organization, thus providing guidance and motivation to staff members. By encouraging people to improve their customer service through the inculcation of positive values, conduct, performance, and a feeling of inclusion, this lever ensures the fulfillment of the organization's objectives. In the absence of a clearly defined Belief System, employees are forced to depend on conjecture regarding the organization's intended behaviors and objectives. ... Without stifling individuals' capacity for innovation or entrepreneurship, this control mechanism permits the development of policies and standards that instruct individuals on what constitutes unacceptable behavior. Boundary systems implement regulations, codes of conduct, and premeditated strategic boundaries to delineate acceptable and unacceptable employee conduct, thereby establishing governing parameters. These boundaries clearly define the irreversible consequences of violating ethical principles and the potential outcomes that should be avoided.



Quote for the day:

"It is better to fail in originality than to succeed in imitation." -- Herman Melville

Daily Tech Digest - December 05, 2022

Is SASE right for your organization? 5 key questions to ask

Many analysts say that SASE is particularly beneficial for mid-market companies because it replaces multiple, and often on-premises, tools with a unified cloud service. Many large enterprises, on the other hand, will not only have legacy constraints to consider, but they may also prefer to take a layered security approach with best-of-breed security tools. Another factor to consider is that the SASE offering might be presented as a consolidated solution, but if you dig a little deeper it might actually be a collection of different tools from various partnering vendors, or features obtained through acquisition that have not been fully integrated. Depending on the service provider, SASE offers a unified suite of security services, including but not limited to encryption, multifactor authentication, threat protection, Data Leak Prevention (DLP), DNS, and traditional firewall services. ... With incumbents such as Cisco, VMware, and HPE all rolling out SASE services, enterprises with existing vendor relationships may be able to adopt SASE without needing to worry much about protecting previous investments.


How gamifying cyber training can improve your defences

Gamification is an attempt to enhance systems and activities by creating similar experiences to those in games, in order to motivate and engage users, while building their confidence. This is typically done through the application of game-design elements and game principles (dynamics and mechanics) in non-game contexts. Research into gamification has proved that it has positive effects. ... Gamification has been dismissed by some as a fad, but the application of elements found within game playing, such as competing or collaborating with others and scoring points, can effectively translate into staff training and improve engagement and interest. “The way that cyber security training sessions are happening is changing and it’s for the better,” says Helen McCullagh, a cyber risk specialist for an end-user organisation. “If you look at the engagement of sitting people down and them doing a one-hour course every year, then it is merely a box-ticking exercise. Organisations are trying to get 100% compliance, but what you have are people sitting there doing their shopping list.”


The 3 Phases Of The Metaverse

There are several misconceptions about the metaverse today. In simple terms, the metaverse is the convergence of physical and digital on a digital plane. In its ideal phase, you can access the metaverse from anywhere, just like the internet. Early metaverse apps were focused on creating games with tokenized incentives (play-to-earn) and hadn’t initially been thought of as contributing to the next phase of the internet. One of the most prominent examples is the online game Second Life, which is regarded as the earliest web2-based metaverse platform. Users have an identity projected through an avatar and participate in activities—very much a limited “second” life. ... Unlike the previous phase, Phase 2 is all about creating utilities. Brands, IP holders and companies investing in innovation have been collaborating with gaming metaverse dApps to understand consumer behaviors and economic dynamics. No-coding tools, as well as software development kits, in this phase, are empowering the end user to co-create alongside developers, designers, brands and retail investors. Still, interoperability—the import and export of digital assets—is only possible on a single chain, and the user experience is still seen as gaming in 2-D or 3-D environments.


Why the Agile approach might not be working for your projects

Although Scrum is a well-described methodology, when applied in practice it is often tailored to the specific circumstances of the organisation. These adaptations are often called ScrumBut (“we use Scrum, but …”). Some deviations from the fundamental principles of Scrum, however, may be problematic. These undesirable deviations are called anti-patterns — bad habits formed and influenced by the human factor. What exactly can we consider an anti-pattern? It can be a disagreement on whether or not a task is completed, a disruption caused by the customer, unclear items in the backlog, the indecisiveness of stakeholders (customers, management, etc.), or a lack of authority or poor technical knowledge on the part of the Scrum master. We collected detailed information from three Scrum teams using a variety of data collection procedures over a sustained period of time — including observation, surveys, secondary data, and semi-structured interviews — to get a detailed understanding of anti-patterns and their causes and consequences.


Rise of Data and Asynchronization Hyped Up at AWS re:Invent

Because it was believed that asynchronous programming was difficult, he said, operating systems tended to have restrained interfaces. “If you wanted to write to the disk, you got blocked until the block was written,” Vogels said. Change began to emerge in the 1990s, he said, with operating systems designed from the ground up to expose asynchrony to the world. “Windows NT was probably the first one to have asynchronous communication or interaction with devices as a first principle in the kernel.” Linux, Vogels said, did not pick up asynchrony until the early 2000s. The benefit of asynchrony, he said, is that it is natural compared with the illusion of synchrony. When compute systems are tightly coupled together, it can lead to widespread failure if something goes wrong, Vogels said. With asynchronous systems, everything is decoupled. “The most important thing is that this is an architecture that can evolve very easily without having to change any of the other components,” he said. “It is a natural way of isolating failures. If any of the components fails, the whole system continues to work.”
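
The blocking-versus-non-blocking distinction Vogels describes is easy to see in code. Here is a minimal C# sketch (the file names are arbitrary) contrasting a synchronous disk write, which blocks the caller, with an asynchronous one that frees the thread for other work:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class AsyncVsSync
{
    static async Task Main()
    {
        // Synchronous style: the caller blocks until the write completes.
        File.WriteAllText("sync.txt", "blocking write");

        // Asynchronous style: the write is started, and the thread is free
        // to do other work until the awaited operation finishes.
        Task write = File.WriteAllTextAsync("async.txt", "non-blocking write");
        Console.WriteLine("Doing other work while the disk write is in flight...");
        await write;

        Console.WriteLine("Both writes finished.");
    }
}
```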


Entity Framework Fundamentals

EF has two ways of managing your database. In this tutorial, I will explain only one of them: code first. The other is database first. There is a big difference between them, but code first is the most used. But before we dive in, I want to explain both approaches. Database first is used when there is already a database present and the database will not be managed by code. Code first is used when there is no current database but you want to create one. I like code first much more because I can write entities (these are basically classes with properties) and let EF update the database accordingly. It's just C#, and I don't have to worry about the database much. I can create a class, tell EF it's an entity, update the database, and all is done! Database first is the other way around: you let the database 'decide' what kind of entities you get. You create the database first and create your code accordingly. ... With Entity Framework, it all starts with a context. It associates entities and relationships with an actual database. Entity Framework comes with DbContext, which is the context that we will be using in our code.
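
Here is a minimal code-first sketch of what that looks like in EF Core. The entity names, the SQLite provider, and the connection string are illustrative assumptions; the excerpt itself does not prescribe them:

```csharp
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;

// An entity: a plain class with properties that EF maps to a table.
public class Blog
{
    public int Id { get; set; }                     // primary key by convention
    public string Name { get; set; } = "";
    public List<Post> Posts { get; set; } = new();  // one-to-many relationship
}

public class Post
{
    public int Id { get; set; }
    public string Title { get; set; } = "";
    public int BlogId { get; set; }                 // foreign key back to Blog
}

// The context associates the entities with an actual database.
public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs => Set<Blog>();
    public DbSet<Post> Posts => Set<Post>();

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlite("Data Source=blogging.db");
}
```

With the context defined, EF's migration tooling (e.g., `dotnet ef migrations add InitialCreate` followed by `dotnet ef database update`) creates and evolves the database schema from these classes.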


How Executive Coaching Can Help You Level Up Your Organization

As we all know, the desire for personal growth is extremely valuable; however, as employee demands from the workplace have shifted, leadership skills have not. As employees climb the ranks, they find their way into leadership without necessarily learning the skills and techniques required to lead. Many new leaders turn to a trusted mentor, who can only provide information based on lived experience. On the other hand, executive coaches are tasked with improving performance and capabilities as their day job. But there is a misconception that executive coaches are for leaders who have done something wrong. While it's true that an executive coach could help a difficult employee become a better teammate, coaches can also be guides for leaders pursuing their desired career paths. Leadership coaching recognizes that the main drivers of innovation in an organization are the people and the corporate culture, and it can provide leaders with the tools to master these levers. An executive coaching professional can guide leaders through the steps that allow them to set the foundations of an innovative and competitive company.


Ransomware: Is there hope beyond the overhyped?

The old way of thinking about cyber security was imagining it like a castle. You’ve got the vast perimeter – the castle walls – and inside was the keep, where employees and data would live. But now organisations are operating in various locations. They’ve got their cloud estate in one or more providers, source code residing in another location, and vast numbers of work devices that are now no longer behind the castle walls but at employees’ homes – the list could go on forever. These are all areas that could potentially be breached and used to gain intelligence on the business. The attack surface is growing, and the castle wall can no longer circle around all these places to protect them. Attack surface management will play a big part in tackling this issue. It allows security and IT teams to almost visualise the external parts of the business, identify targets, and assess risks based on the opportunities they present to a malicious attacker. In the face of a constantly growing attack surface, this can enable businesses to establish a proactive security approach and adopt principles such as assume breach and cyber resilience.


How data analysts can help CIOs bridge the tech talent shortfall

Business analytics are only as good as the data they’re using. Given the wealth and complexity of data, it’s easy to understand why leaders are often overwhelmed in their attempts to access better analytics and insights. This is where data professionals can help. Data scientists and analysts are experts in statistics, math, databases, and systems. They are especially adept at looking at historical metrics, recognizing patterns, pulling in market insights, and identifying outlier data to ensure the best points are utilized. They’re also able to organize vast amounts of unstructured data, which is often very valuable but difficult to analyze, by leveraging conventional databases and other tools to make the data more actionable. ... It’s also important to look at the attributes of the data scientists and analysts themselves. In addition to having technical skills, data professionals with a background in programming, data visualization, and machine learning are also highly valuable. On the non-technical side, they should have strong interpersonal and communication skills to relay their findings to the tech team and to those without a tech or math background.


What Does Technical Debt Tell You?

Making most architectural decisions at the beginning of a project, often before the QARs are precisely defined, results in an upfront architecture that may not be easy to evolve and will probably need to be significantly refactored when the QARs are better defined. By contrast, having a continuous flow of architectural decisions as part of each Sprint results in an agile architecture that can better respond to QAR changes. Almost every architectural decision is a trade-off between at least two QARs. For example, consider security vs. usability. Regardless of the decision being made, it is likely to increase technical debt, either by making the system more vulnerable by giving priority to usability or making it less usable by giving priority to security. Either way, this will need to be addressed at some point in the future as the user population increases, and the initial decision to prioritize one QAR over the other may need to be reversed to keep the technical debt manageable. Other examples include scalability vs. modifiability, and scalability vs. time to market. These decisions are often characterized as "satisficing", i.e., "good enough".



Quote for the day:

"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr

Daily Tech Digest - October 15, 2021

You’ve migrated to the cloud, now what?

When thinking about cost governance, for example, in an on-premises infrastructure world, costs increase in increments when we purchase equipment, sign a vendor contract, or hire staff. These items are relatively easy to control because they require management approval and are usually subject to rigid oversight. In the cloud, however, an enterprise might have 500 virtual machines one minute and 5,000 a few minutes later when autoscaling functions engage to meet demand. Similar differences abound in security management and workload reliability. Technology leaders with legacy thinking are faced with harsh trade-offs between control and the benefits of cloud. These benefits can include agility, scalability, lower cost, and innovation, and they require heavy reliance on automation rather than manual legacy processes. This means that the skillsets of an existing team may not be the same skillsets needed in the new cloud order. When writing a few lines of code supplants plugging in drives and running cable, team members often feel threatened. This can mean that success requires not only a different way of thinking but also a different style of leadership.


A new edge in global stability: What does space security entail for states?

Observers recently recentred the debate on a particular aspect of space security, namely anti-satellite (ASAT) technologies. The destruction of assets placed in outer space is high on the list of issues they identify as most pressing and requiring immediate action. As a result, some researchers and experts rolled out propositions to advance a transparent and cooperative approach, promoting the cessation of destructive operations both in outer space and launched from the ground. One approach was the development of ASAT Test Guidelines, first initiated in 2013 by a Group of Governmental Experts on Outer Space Transparency and Confidence-Building Measures. Another is through general calls to ban anti-satellite tests, not only to build a more comprehensive arms control regime for outer space and prevent the production of debris, but also to reduce threats to space security and regulate destabilising force. Many space community members threw their support behind a letter urging the United Nations (UN) General Assembly to take up for consideration a kinetic anti-satellite (ASAT) Test Ban Treaty, for maintaining safe access to Earth orbit and decreasing concerns about collisions and the proliferation of space debris.


From data to knowledge and AI via graphs: Technology to support a knowledge-based economy

Leveraging connections in data is a prominent way of getting value out of data. Graph is the best way of leveraging connections, and graph databases excel at this. Graph databases make expressing and querying connections easy and powerful. This is why graph databases are a good match in use cases that require leveraging connections in data: anti-fraud, recommendations, Customer 360, or Master Data Management. From operational applications to analytics, and from data integration to machine learning, graph gives you an edge. There is a difference between graph analytics and graph databases. Graph analytics can be performed on any back end, as it only requires reading graph-shaped data. Graph databases are databases with the ability to fully support both read and write, utilizing a graph data model, API and query language. Graph databases have been around for a long time, but the attention they have been getting since 2017 is off the charts. AWS and Microsoft moving into the domain, with Neptune and Cosmos DB respectively, exposed graph databases to a wider audience.


Observability Is the New Kubernetes

So where will observability head in the next two to five years? Fong-Jones said the next step is to support developers in adding instrumentation to code, expressing a need to strike a balance between easy and out of the box and annotations and customizations per use case. Suereth said that the OpenTelemetry project is heading in the next five years toward being useful to app developers, where instrumentation can be particularly expensive. “Target devs to provide observability for operations instead of the opposite. That’s done through stability and protocols.” He said that right now observability, like with Prometheus, is much more focused on operations than on developer languages. “I think we’re going to start to see applications providing observability as part of their own profile.” Suereth continued that the OpenTelemetry open source project has an objective to have an API with all the traces, logs and metrics with a single pull, but it’s still to be determined how much data should be attached to it.


Data Exploration, Understanding, and Visualization

Many scaling methods require knowledge of critical values within the feature distribution and can cause data leakage. For example, a min-max scaler should fit training data only rather than the entire data set. When the minimum or maximum is in the test set, you have introduced some data leakage into the prediction process. ... The one-dimensional frequency plot shown below each distribution provides insight into the data. At first glance, this information looks redundant, but these plots directly address problems with representing data in histograms or as distributions. For example, when data is transformed into a histogram, the number of bins is specified. It is difficult to decipher any pattern with too many bins, and with too few bins, the data distribution is lost. Moreover, representing data as a distribution assumes the data is continuous. When data is not continuous, this may indicate an error in the data or an important detail about the feature. The one-dimensional frequency plots fill in the gaps where histograms fail.
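
Here is a minimal C# sketch of the leakage-safe pattern: the scaler learns its critical values from the training split only and is then applied unchanged to the test split (the class and the sample values are illustrative):

```csharp
using System;
using System.Linq;

// Min-max scaler that learns its critical values (min, max) from the
// training split only, so information from the test set never leaks in.
class MinMaxScaler
{
    private double _min, _max;

    public void Fit(double[] train)
    {
        _min = train.Min();
        _max = train.Max();
    }

    public double[] Transform(double[] data) =>
        data.Select(x => (x - _min) / (_max - _min)).ToArray();
}

class Demo
{
    static void Main()
    {
        double[] train = { 10, 20, 30, 40 };
        double[] test = { 5, 25, 50 };   // 5 and 50 fall outside the training range

        var scaler = new MinMaxScaler();
        scaler.Fit(train);               // fit on training data only
        Console.WriteLine(string.Join(", ", scaler.Transform(test)));
        // Values outside [0,1] are expected here and honestly reflect
        // that the test set contains values the training set never saw.
    }
}
```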


DevSecOps: A Complete Guide

Both DevOps and DevSecOps use some degree of automation for simple tasks, freeing up time for developers to focus on more important aspects of the software. The concept of continuous processes applies to both practices, ensuring that the main objectives of development, operation, or security are met at each stage. This prevents bottlenecks in the pipeline and allows teams and technologies to work in unison. By working together, development, operational or security experts can write new applications and software updates in a timely fashion, monitor, log, and assess the codebase and security perimeter as well as roll out new and improved codebase with a central repository. The main difference between DevOps and DevSecOps is quite clear. The latter incorporates a renewed focus on security that was previously overlooked by other methodologies and frameworks. In the past, the speed at which a new application could be created and released was emphasized, only to be stuck in a frustrating silo as cybersecurity experts reviewed the code and pointed out security vulnerabilities.


Skilling employees at scale: Changing the corporate learning paradigm

Corporate skilling programs have been founded on frameworks and models from the world of academia. Even when we have moved to digital learning platforms, the core tenets of these programs tend to remain the same. There is a standard course with finite learning material, a uniformly structured progression to navigate the learning, and the exact same assessment tool to measure progress. This uniformity and standardization have been the only approach for organizations to skill their employees at scale. As a result, organizations made a trade-off: content-heavy learning solutions, which focus on knowledge dissemination but offer no way to measure the benefit and are limited to vanity metrics, have become the norm for training the workforce at large. On the other hand, one-on-one coaching programs that promise results are exclusive to only the top one or two percent of the workforce, usually reserved for high-performing or high-potential employees. This is because such programs have a clear, measurable, and direct impact on behavioral change and job performance.


The Ultimate SaaS Security Posture Management (SSPM) Checklist

The capability of governance across the whole SaaS estate is both nuanced and complicated. While the native security controls of SaaS apps are often robust, it falls to the organization to ensure that all configurations are properly set — from global settings to every user role and privilege. It only takes one unknowing SaaS admin changing a setting or sharing the wrong report for confidential company data to be exposed. The security team is burdened with knowing every app, user and configuration and ensuring they are all compliant with industry and company policy. Effective SSPM solutions come to answer these pains and provide full visibility into the company's SaaS security posture, checking for compliance with industry standards and company policy. Some solutions even offer the ability to remediate right from within the solution. As a result, an SSPM tool can significantly improve security-team efficiency and protect company data by automating the remediation of misconfigurations throughout the increasingly complex SaaS estate.


Why gamification is a great tool for employee engagement

Gamification is the beating heart of almost everything we touch in the digital world. With employees working remotely, it is a golden solution for employers. Applied in the right format, gaming can help create engagement in today's remote working environment, motivate personal growth, and encourage continuous improvement across an organization. ... In the connected workspace, gamification is essentially a method of providing simple goals and motivations that rely on digital rather than in-person engagement. At the same time, there is a tacit understanding between the game designer and the "player" that when these goals are aligned in a way that benefits the organization, the rewards often extend beyond the bottom line. Engaged employees are valuable contributors to defined business goals, and studies show that disengagement hurts the bottom line. Motivated employees are also more likely to want to make the customer experience as satisfying as possible, especially if there is internal recognition of a job well done.


10 Cloud Deficiencies You Should Know

What happens if your cloud environment goes down due to challenges outside your control? If your answer is “Eek, I don’t want to think about that!” you’re not prepared enough. Disaster preparedness plans can include running your workload across multiple availability zones or regions, or even in a multicloud environment. Make sure you have stakeholders (and back-up stakeholders) assigned to any manual tasks, such as switching to backup instances or relaunching from a system restore point. Remember, don’t wait until you’re faced with a worst-case scenario to test your response. Set up drills and trial runs to make sure your ducks are quacking in a row. One thing you might not imagine the cloud being is … boring. Without cloud automation, there are a lot of manual and tedious tasks to complete, and if you have 100 VMs, they’ll require constant monitoring, configuration and management 100 times over. You’ll need to think about configuring VMs according to your business requirements, setting up virtual networks, adjusting for scale and even managing availability and performance. 
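As a hedged sketch of automating one such manual task, the snippet below sweeps a fleet for unhealthy instances with the AWS SDK for Python; the region is an assumption, pagination is omitted for brevity, and a real disaster drill would also exercise failover:

```python
# Minimal fleet health sweep with boto3; a real drill would also test failover.
import boto3

def unhealthy_instances(region="us-east-1"):   # region is an assumption
    ec2 = boto3.client("ec2", region_name=region)
    resp = ec2.describe_instance_status(IncludeAllInstances=True)
    flagged = []
    for status in resp["InstanceStatuses"]:
        system_ok = status["SystemStatus"]["Status"] == "ok"
        instance_ok = status["InstanceStatus"]["Status"] == "ok"
        if not (system_ok and instance_ok):
            flagged.append(status["InstanceId"])
    return flagged

if __name__ == "__main__":
    for instance_id in unhealthy_instances():
        print(f"needs attention: {instance_id}")
```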



Quote for the day:

"Leaders begin with a different question than others. Replacing who can I blame with how am I responsible?" -- Orrin Woodward

Daily Tech Digest - September 27, 2021

How to Get Started With Zero Trust in a SaaS Environment

While opinions vary on what zero trust is and is not, this security model generally considers the user's identity as the root of decision-making when determining whether to allow access to an information resource. This contrasts with earlier approaches that made decisions based on the network from which the person was connecting. For example, we often presumed that workers in the office were connecting directly to the organization's network and, therefore, could be trusted to access the company's data. Today, however, organizations can no longer grant special privileges based on the assumption that the request is coming from a trusted network. With the high number of remote and geographically dispersed employees, there is a good chance the connections originate from a network the company doesn't control. This trend will continue. IT and security decision-makers expect remote end users to account for 40% of their workforce after the COVID-19 outbreak is controlled, an increase of 74% relative to pre-pandemic levels, according to "The Current State of the IT Asset Visibility Gap and Post-Pandemic Preparedness," with research conducted by the Enterprise Strategy Group for Axonius.
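To illustrate the contrast in decision logic, here is a toy comparison (the attributes and checks are invented for illustration and are nowhere near a complete zero-trust implementation):

```python
# Toy access decision: identity-centric (zero trust) vs. network-centric (legacy).
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    mfa_passed: bool
    device_compliant: bool
    source_ip: str

def legacy_decision(req: AccessRequest, office_prefix: str = "10.0.") -> bool:
    # Old model: trust anything that comes from the corporate network.
    return req.source_ip.startswith(office_prefix)

def zero_trust_decision(req: AccessRequest, allowed_users: set) -> bool:
    # Identity is the root of the decision; network location is irrelevant.
    return req.user_id in allowed_users and req.mfa_passed and req.device_compliant

req = AccessRequest("alice", mfa_passed=True, device_compliant=True, source_ip="203.0.113.7")
print(legacy_decision(req))                  # False: off-network, so the old model denies
print(zero_trust_decision(req, {"alice"}))   # True: verified identity and healthy device
```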


Tons Of Data At The Company Store

Confidentially, many chief data officers will admit that their companies suffer from what might euphemistically be called “data dyspepsia”: they produce and ingest so much data that they cannot properly digest it. Like it or not, there is such a thing as too much data – especially in an era of all-you-can-ingest data comestibles. “Our belief is that more young companies die of indigestion than starvation,” said Adam Wilson, CEO of data engineering specialist Trifacta, during a recent episode of Inside Analysis, a weekly data- and analytics-focused program hosted by Eric Kavanagh. So what if Wilson was referring specifically to Trifacta’s decision to stay focused on its core competency, data engineering, instead of diversifying into adjacent markets. So what if he was not, in fact, alluding to a status quo in which the average business feels overwhelmed by data. Wilson’s metaphor is no less apt if applied to data dyspepsia. It also fits with Trifacta’s own pitch, which involves simplifying data engineering – and automating it, insofar as is practicable – in order to accelerate the rate at which useful data can be made available to more and different kinds of consumers.


Hyperconverged analytics continues to guide Tibco strategy

One of the trends we're seeing is that people know how to build models, but there are two challenges: one on the input side and one on the output side. On the input side, you can build the greatest models in the world, but if you feed them bad data, that's not going to help. So there's renewed interest in things like data governance, data quality and data security. AI and ML are still very important, but there's more to it than just building the models. The quality of the data, and the governance and processes around the data, are also very important. That way you feed your model better data, which makes your model more accurate, and from there you're going to get better outcomes. On the output side, since there are so many models being built, organizations are having trouble operationalizing them all. How do you deploy them into production, how do you monitor them, how do you know when it's time to go back and rework a model, how do you deploy them at the edge, how do you deploy them in the cloud, and how do you deploy them in an application?
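One concrete slice of that operationalization problem is knowing when it's time to rework a model. A hedged sketch of an input-drift check follows; the features, threshold, and data are illustrative, and this is not specific to any vendor:

```python
# Illustrative input-drift check for a deployed model; threshold is invented.
import numpy as np
from scipy.stats import ks_2samp

def drifted_features(train_X, live_X, feature_names, alpha=0.01):
    """Flag features whose live distribution diverges from training (two-sample KS test)."""
    flagged = []
    for i, name in enumerate(feature_names):
        statistic, p_value = ks_2samp(train_X[:, i], live_X[:, i])
        if p_value < alpha:
            flagged.append((name, statistic))
    return flagged

rng = np.random.default_rng(1)
train = rng.normal(0, 1, size=(5000, 2))
live = np.column_stack([rng.normal(0, 1, 5000),
                        rng.normal(0.5, 1, 5000)])   # second feature has shifted

for name, stat in drifted_features(train, live, ["feature_a", "feature_b"]):
    print(f"{name} drifted (KS statistic {stat:.3f}) -- time to rework the model?")
```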


Gamification: A Strategy for Enterprises to Enable Digital Product Practices

As digital products take precedence, the software ecosystem brings new possibilities to products. With the rise of digital products, cross-functional boundaries are blurring; new skills, and unlearning old ways, are critical. Gamification can support a ladder approach to acquiring and applying new skills across continuous software delivery, testing, and security. However, harnessing collective wisdom through gamification needs a systematic framework in which game ideation, design, validation, and incentives are integrated with different persona types. Applying gamification systematically to solve serious problems, ideate, and create new knowledge in a fun way is challenging. To successfully apply gamification for upskilling and boosting productivity, it must be accompanied by an understanding of its purposefulness through the following two critical perspectives: benefits of embracing gamification for people – removing fear, having fun, and making the desirable shift toward new knowledge; creating an environment that is inclusive and provides a learning ecosystem for all.
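A minimal sketch of what such a ladder could look like in code (persona types, activities, levels, and point values are all invented for illustration):

```python
# Hypothetical skills-ladder model for gamified upskilling; all values invented.
from dataclasses import dataclass, field

LADDER = [                # (level name, points required), in ascending order
    ("Novice", 0),
    ("Practitioner", 100),
    ("Mentor", 300),
]

POINTS_BY_ACTIVITY = {
    "complete_secure_coding_kata": 25,
    "review_peer_pipeline": 15,
    "publish_internal_guide": 50,
}

@dataclass
class Learner:
    name: str
    persona: str                      # e.g. "developer", "tester", "security"
    points: int = 0
    badges: list = field(default_factory=list)

    def record(self, activity):
        self.points += POINTS_BY_ACTIVITY[activity]
        self.badges.append(activity)

    def level(self):
        current = LADDER[0][0]
        for name, needed in LADDER:   # highest rung whose threshold is met
            if self.points >= needed:
                current = name
        return current

dev = Learner("asha", persona="developer")
dev.record("complete_secure_coding_kata")
dev.record("publish_internal_guide")
print(dev.level(), dev.points)        # Novice 75 -- next rung is in sight
```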


Artificial Intelligence: The Future Of Cybersecurity?

Cybersecurity in Industry 4.0 can't be tackled the same way as in traditional computing environments; the number of devices and associated challenges is far too great. Imagine monitoring security alerts for millions of connected devices globally. IIoT devices possess limited computing power and therefore can't run conventional security solutions. This is where AI and machine learning come into play. ML can help make up for the shortage of security staff. AI can help discover devices and hidden patterns while processing large amounts of data, and ML can monitor incoming and outgoing traffic for deviations in behavior across the IoT ecosystem, as sketched below. If a threat or anomaly is detected, alarms can be sent to security admins warning them about the suspicious traffic. AI and ML can also be used to build lightweight endpoint detection technologies. This can be indispensable in situations where IoT devices lack the processing power and need behavior-based detection capabilities that aren't resource intensive. AI and ML technologies are, however, a double-edged sword.
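A hedged sketch of that behavior-based monitoring, using an off-the-shelf anomaly detector on simple per-device traffic features (the features, values, and contamination rate are illustrative):

```python
# Illustrative anomaly detection on IoT traffic features; not production-ready.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Toy features per device-minute: [bytes_out, packet_count, distinct_destinations]
normal_traffic = rng.normal(loc=[500, 40, 3], scale=[50, 5, 1], size=(2000, 3))

detector = IsolationForest(contamination=0.01, random_state=42)
detector.fit(normal_traffic)          # learn the fleet's normal behavior

new_window = np.array([
    [510, 42, 3],                     # looks normal
    [9000, 800, 60],                  # exfiltration-like burst
])
for row, label in zip(new_window, detector.predict(new_window)):
    if label == -1:                   # scikit-learn marks anomalies with -1
        print("alert security admins:", row)
```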


3 ways any company can guard against insider threats this October

Companies don't become cyber smart by accident. In fact, cybersecurity is rarely top-of-mind for the average employee as they go about their day and pursue their professional responsibilities. Therefore, businesses are responsible for educating their workforce, training their teams to identify and defend against the latest threat patterns. For instance, phishing scams have increased significantly since the pandemic's onset, and each malicious message threatens to undermine data integrity. Meanwhile, many employees can't identify these threats, and they wouldn't know how to respond if they did. Of course, education isn't limited to phishing scams. One survey found that 61 percent of employees failed a basic quiz on cybersecurity fundamentals. With the average company spending only 5 percent of its IT budget on employee training, it's clear that education is an untapped opportunity for many organizations to #BeCyberSmart. When education is coupled with intentional accountability measures that ensure training is put into practice, companies can transform their unaware employees into formidable defensive assets.


VMware gears up for a challenging future

“What we are doing is pivoting our portfolio or positioning our portfolio to become the multi-cloud platform for our customers in three ways,” Raghuram said. “One is enabling them to execute their application transformation on the cloud of their choice using our Tanzu portfolio. And Tanzu is getting increased momentum, especially in the public cloud to help them master the complexities of doing application modernization in the cloud. And of course, by putting our cloud infrastructure across all clouds, and we are the only one with the cloud infrastructure across all clouds and forming the strategic partnerships with all of the cloud vendors, we are helping them take their enterprise applications to the right cloud.” Building useful modern enterprise applications is a core customer concern, experts say. “Most new apps are built on containers for speed and scalability. The clear winner of the container wars was Kubernetes,” said Scott Miller, senior director of strategic partnerships for World Wide Technology (WWT), a technology and supply-chain service provider and a VMware partner.


Software cybersecurity labels face practical, cost challenges

Cost and feasibility are among the top challenges of creating consumer labels for software. Adding to these challenges is the fact that software is continually updated. Moreover, software comes in both open-source and proprietary formats and is created by a global ecosystem of firms that range from mom-and-pop shops all the way up to Silicon Valley software giants. "It's way too easy to create requirements that cannot be met in the real world," David Wheeler, director of open source supply chain security at the Linux Foundation and leader of the Core Infrastructure Initiative Best Practices Badge program, said at the workshop. "A lot of open-source projects allow people to use them at no cost. There's often no revenue stream. You have to spend a million dollars at an independent lab for an audit. [That] ignores the reality that for many projects, that's an impractical burden." ... Another critical aspect of creating software labels is to ensure that they don't reflect static points in time but are instead dynamic, taking into account the fluid nature of software. 


Work’s not getting any easier for parents

Part of many managers’ discomfort with remote work is that they are unsure how to gauge their off-site employees’ performance and productivity. Some business leaders equate face time with productivity. I’ll never forget a visit I had to a Silicon Valley startup in which the manager showing me around described a colleague this way: “He’s such a great worker. He’s here every night until 10, and back in early every morning!” In my work helping businesses update their policies and cultures to accommodate caregivers, I often have to rid managers of this old notion. There’s nothing impressive, or even good, about being in the office so much. To help change the paradigm, I work with managers to find new ways of measuring an individual’s performance and productivity. Instead of focusing on hours worked per day, we look at an employee’s achievements across a broader time metric, such as a month or quarter. We ask, what did the employee do for the company during that time? It’s often then that businesses realize how little overlap there is between those who are seen working the most and those who have the greatest impact on the company. 


How to use feedback loops to improve your team's performance

In systems, feedback is a fundamental force behind their workings. When we fly a plane, we get feedback from our instruments and our co-pilot. When we develop software, we get feedback from our compiler, our tests, our peers, our monitoring, and our users. Dissent works because it's a form of feedback, and clear, rapid feedback is essential for a well-functioning system. As examined in “Accelerate”, a four-year study of thousands of technology organizations found that fostering a culture that openly shares information is a sure way to improve software delivery performance. It even predicts the ability to meet non-technical goals. These cultures, known as “generative” in Ron Westrum's model of organizational culture, are performance- and learning-oriented. They understand that information, especially if it's difficult to receive, only helps them achieve their mission, and so, without fear of retaliation, associates speak up more frequently than in rule-oriented (“bureaucratic”) or power-oriented (“pathological”) cultures. Messengers are praised, not shot.



Quote for the day:

"A pat on the back is only a few vertebrae removed from a kick in the pants, but is miles ahead in results." -- W. Wilcox