Posts

Appointing A Chief Data Officer To Increase The Economic Value Of Businesses

“There is a new position, the Chief Data Officer. It’s a good idea, but there has been poor execution. What has been happening is taking a CIO and giving them a new title of CDO. However, it should be the Chief Data Monetisation Officer. The job is to determine how to monetise the data you have available. This should be an economics person rather than IT person.” – Jacob Morgan: Principal of Chess Media Group.

Businesses making it a top priority to bring in a Chief Data Officer are doing so as a means to ensure the quality, governance and performance of their Big Data projects are at their best. The threat of losing opportunities from disruptive innovation and the fear of being unable to manage the exponential growth of data have been key reasons for the sharp increase in hiring for this role.

“For some organisations today, data has become such an explosive part of business that they have created a Chief Data Officer (CDO) position to reside next to the Chief Information Officer and the Chief Technology Officer. This evolution clearly acknowledges that data in the business setting is separate from the systems running it. Beyond that, it recognises that data has a value that, if fully exploited, can help drive profitable business.” – Wired.

With business acumen, the ability to lead change and suitable IT awareness as initial qualifiers, there are many other factors that an executive leader should take into consideration before taking the leap and appointing a Chief Data Officer. Here are a few:

 

#1 – Establishing A Clear Outline Of Roles & Responsibilities

First and foremost, in order to ensure the new executive you’re bringing on board is set up on a path to success, it’s important to present the leadership team with a clear definition of roles & responsibilities, and a solid understanding of what the organisation is hoping to achieve. This will help the CDO create a roadmap that is aligned with the organisation’s goals and highlights potential obstacles that need to be addressed, as well as minimise border skirmishes with CIO and CTO peers. The CDO must be suitably empowered and supported to succeed: without authority and support, the role is unlikely to deliver the required value to the business, especially given the CDO’s remit can include challenging existing practices and contributing to digital transformation within the organisation.

“A successful roadmap should divide the implementation into logical phases in order to reduce implementation risk. Phases should be around three months in duration. Taking on all the metrics and goals at the same time or in large chunks is very risky primarily because business users lose interest if they are not engaged on an ongoing basis. Prioritise your roadmap phases in order of importance to your business so that you reap the most benefits from your analytics early in your roadmap and provide justification for additional phases. Strong early success provides the critical mass and positive impression about analytics which leads to stronger business adoption.” – StatSlice.

 

#2 – Building The Right Team

“As well as a financial cost, there’s obviously also a cost in human resources and time. If you have data scientists bumbling their way through hundreds of projects with no clear aim, or decoding terabytes of data you have no clear, immediate use for, they’re likely to be unavailable, or distracted, when something of real value comes along. Having the right people with the right skills in the right place is essential.” – Talend.

Part of the responsibility of a Chief Data Officer is to hire the right team and steer Big Data projects to success. In order to put together an A-level team, there needs to be a clear set of qualities, characteristics and expectations of prior experience that the CDO must look out for in the hiring process. A considered approach to recruitment and selection, recognising the change process the business must navigate, will help to select the stand-out candidates that are most suitable for the role.

One example is hiring a Data Scientist. Some of the most important traits include statistical thinking, good communication skills, creativity, curiosity, and of course, the right technical skills.

“A great data scientist has a hacker’s spirit. Technical flexibility is as important as experience, because in this field the gold standards change with an alarming rate. Data scientists work together, love open source, and share our knowledge and experience to make sure that we can move at the speed of demand. If your data scientist is a quick study, you’ve made a sound investment beyond the current trend cycle.” – Datascope Analytics.

 

#3 – Strategic Allocation Of Budget & Resources

“Analytics – the ability to find meaningful patterns in data – can help manage costs, lead to efficiency and better decisions, increase services and make better use of capital.” – Carlos Londono: Global Supply Chain VP at Owens Illinois Inc.

A CDO is responsible for the cost, schedule, delegation of tasks, coaching and technical performance of a Big Data project. In order to be able to implement change, invest in the right technology and systems for processing data, oversee and guide the team and achieve a profitable outcome, effective project management techniques must be adopted to keep track of whether objectives and KPIs are being met.

Among these is the responsibility to determine which project management method is most suitable for the project; a popular choice among many organisations is the Agile method.

“By delivering the work in small increments of working – even production ready – software, those assumptions are all validated early on. All code, design, architecture and requirements are validated every time a new increment is delivered, even the plan is validated as teams get real and accurate data around the progress of the project. But the early validation is not the only benefit that Agile brings, it also allows projects to learn from the feedback, take in new or changing requirements and quickly change direction when necessary, without changing the process at all.” – Gino Marckx: Founder & Business Improvement Consultant at Xodiac Inc.

 

 

For more resources, please see below:

The Rise Of The Chief Data Officer

Six Qualities Of A Great Data Scientist

Developing A Business Analytics Roadmap

Where Is Technology Taking The Economy?

Staffing Strategies For The Chief Data Officer

12 Qualities Your Next Chief Data Officer Should Have

Why Businesses That Use “Big Data” Make More Money

Making Data Analytics Work For You – Instead Of The Other Way Around

How To Turn Any Big Data Project Into A Success (And Key Pitfalls To Avoid)

 

To discuss this and other topics, please contact the team at Contexti – + 61 28294 2161 | connect@contexti.com

Insights From Five Companies Winning With Big Data Analytics

Harnessing the power of Big Data and finding the right set of tools to generate value from it efficiently come with their challenges. Successfully utilising the power of technology starts with a shift in culture: adopting a data-driven mindset and clearly identifying the business challenges you are looking to address with data analytics.

“The biggest challenge of making the evolution from a knowing culture to a learning culture—from a culture that largely depends on heuristics in decision making to a culture that is much more objective and data driven and embraces the power of data and technology—is really not the cost. Initially, it largely ends up being imagination and inertia.” – Murli Buluswar: Chief Science Officer at AIG

Businesses can use information derived from data to increase their efficiency and success in many ways, like automating processes and gaining in-depth knowledge of target markets. This month, we’ve gained insights from five businesses who are front-runners in the data analytics game.

 

#1 – AMAZON

“The next time you contact the Amazon help desk with a query, don’t be surprised when the employee on the other end already has most of the pertinent information about you on hand. This allows for a faster, more efficient customer service experience that doesn’t include having to spell out your name three times.” – Eleanor O’Neill: Writer at ICAS.

Amazon, the online retail giant, has mastered the art of ecommerce. By embracing cutting-edge technology to analyse and make use of the massive amount of customer data they have access to, they have become pros at supply chain optimisation, price optimisation and fraud detection. With sophisticated advertising algorithms, and by leveraging their Amazon Elastic MapReduce platform for machine learning, the company has built an empire by providing goods to customers faster and cheaper than their competitors, backed by exceptional customer service.

“Amazon.com Inc is a leader in collecting, storing, processing and analysing personal information from you and every other customer as a means of determining how customers are spending their money. The company uses predictive analytics for targeted marketing to increase customer satisfaction and build company loyalty.” – Jennifer Wills: Owner of JDW Writing.

 

#2 – GOOGLE

“Google is of course an expert in Big Data. They have developed many open source tools and technologies that are widely used in the big data ecosystem. Using many different Big Data techniques, it is capable of sifting through millions of websites and petabytes of data and to give you the right answer within milliseconds. How do they do that?” – Datafloq.

Aside from their impressive search engine, Google’s strategy of mining data and placing targeted ads in front of customers who have used free Google products before has been a key factor in their success, allowing them to track customers based on their behaviour and interests. Google’s service offering to businesses looking to get their ads in front of the right customers has been a huge revenue builder for the organisation.

“Google has not only significantly influenced the way we can now analyse Big Data (think MapReduce, BigQuery, etc.) – but they probably are more responsible than anyone else for making it part of our everyday lives. I believe that many of the innovative things Google is doing today, most companies will do in years to come. Although these days Google’s Big Data innovation goes well beyond basic search, it’s still their core business.” – Bernard Marr: Founder & CEO of Bernard Marr & Co.

 

#3 – NETFLIX

With a user base of approximately 99 million, Netflix collects and analyses a colossal amount of behavioural data to reveal insights for decision-making in a way that differentiates it from competitors like Stan and Amazon Prime Video.

“From predicting the kind of content that would garner high viewership to recommending content to specific users, Netflix uses data everywhere. In fact, since its days of being a DVD-by-mail service, Netflix placed prime importance on collecting user data and building a recommendation system. Cinematch was the first algorithm behind their recommendation system. After launching their streaming media service in 2007, it took them 6 years to collect enough data to predict the sure-shot success of their first original production ‘House of Cards’. Data accumulated from numerous sources influence decisions regarding shows. Not only user data, Netflix also observe data generated by piracy sites. “Prison Break” is a hit show on that front.” – Toai Chowdhury: Author at upX Academy.

 

#4 – AMERICAN EXPRESS

“The AMEX team now comprises 800 data scientists globally. American Express claims the lowest fraud loss rate on their records, and among the lowest in the industry. The company states that benefits from fraud improvement alone have paid for their investments in Big Data.” – Randy Bean: CEO & Founder of NewVantage Partners LLC.

AMEX has improved its identification of customer attrition using IBM’s SPSS predictive analytics modelling software. The model delivers a list of the prospective customers at highest risk, allowing the organisation to reach them through methods such as direct marketing and follow-up calls.

“American Express increasingly is moving away from focusing on its traditional function of providing credit for consumers and providing merchant services for processing transactions, and toward actually making the connection between consumers and the businesses that want to reach them. The company is using its vast data flows to develop apps that can connect a cardholder with products or services. One app looks at past purchase data and then recommends restaurants in the area that the user is likely to enjoy.” – Bernard Marr: Founder & CEO of Bernard Marr & Co.

 

#5 – APPLE

“With the help of Big Data Analytics and Hadoop cloud, Apple has positioned itself as not just one of the best tech companies around, but one of the best companies period. That reign will likely continue into the future as Apple utilises Big Data in new and exciting ways.” – Jonathan Buckley: Founder & Principal of The Artesian Network LLC.

Apple’s partnerships with enterprise experts like Cisco, Deloitte, IBM and SAP have contributed to its success as a powerful presence in the mobile market, with millions of loyal customers around the world. With the wide range of apps it has released for banking, insurance, travel and entertainment, and the launch of wearable devices like the Apple Watch, Apple is collecting more customer data than ever before.

“As well as positioning itself as an ‘enabler’ of Big Data in other people’s lives, it has also been put to use in its own internal systems. Apple has often been secretive about the processes behind its traditionally greatest strength – product design. However it is known that Big Data also plays a part here. Data is collected about how, when and where its products – Smart phones, tablets, computers and now watches – are used, to determine what new features should be added, or how the way they are operated can be tweaked to provide the most comfortable and logical user experience.” – Bernard Marr: Founder & CEO of Bernard Marr & Co.

 

 

For more resources, please see below:

10 Companies That Are Using Big Data

How Companies Are Using Big Data & Analytics

6 Ways To Win In Business With Big Data Analytics

16 Case Studies of Companies Proving ROI of Big Data

 

Google

Wow! Big Data At Google

How Google Applies Big Data To Know You

What Would Google Do? Leveraging Data Analytics To Grow Your Organisation

 

Apple

How Apple Is Using Big Data

How Apple Uses Big Data To Drive Business Success

 

Amazon

Amazon EMR

How Amazon Is Leveraging Big Data

7 Ways Amazon Uses Big Data To Stalk You

How Amazon Became The World’s Largest Online Retailer

 

American Express

Inside American Express’ Big Data Journey

American Express Charges Into The World of Big Data

How Predictive Analytics Is Tackling Customer Attrition At American Express

 

Netflix

Big Data: How Netflix Uses It To Drive Business Success

How Netflix Uses Big Data Analytics To Ensure Success

Deep Learning Technologies Enabling Innovation

“Deep Learning has had a huge impact on computer science, making it possible to explore new frontiers of research and to develop amazingly useful products that millions of people use every day.” – Rajat Monga, Engineering Director at TensorFlow & Jeff Dean, Senior Fellow at Google.

With innovation driving business success, demand for community-based, open-source software that incorporates AI and deep learning is surging among start-ups and enterprises alike. We’ve rounded up a few successful deep learning technologies that are making a big impact.

 

#1 – TensorFlow

TensorFlow is an open source software library that uses data flow graphs for numerical computation. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays communicated between them. With extensive built-in support for deep learning, TensorFlow can compute any algorithm that can be expressed in a computational flow graph.

“TensorFlow was built from the ground up to be fast, portable, and ready for production service. You can move your idea seamlessly from training on your desktop GPU to running on your mobile phone. And you can get started quickly with powerful machine learning tech by using our state-of-the-art example model architectures.” – Google Research Blog.
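
To make the dataflow-graph model concrete, here’s a minimal sketch using the TensorFlow 1.x graph API (the API current when this was written); the matrices are arbitrary illustrations.

```python
# A minimal sketch of TensorFlow's dataflow-graph model (TF 1.x API).
import tensorflow as tf

# Nodes are operations; edges carry multidimensional arrays (tensors).
a = tf.placeholder(tf.float32, shape=(2, 2), name="a")
b = tf.constant([[1.0, 0.0], [0.0, 1.0]], name="identity")
product = tf.matmul(a, b)  # defines a graph node; nothing is computed yet

with tf.Session() as sess:
    # The graph only executes when run inside a session.
    result = sess.run(product, feed_dict={a: [[2.0, 3.0], [4.0, 5.0]]})
    print(result)  # [[2. 3.] [4. 5.]]
```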

 

#2 – IBM PowerAI

Offering a collection of open-source frameworks for deep learning in one installable package, IBM PowerAI claims to simplify the installation and system optimisation required to bring up a deep learning infrastructure.

“PowerAI makes deep learning, machine learning, and AI more accessible and more performant. By combining this software platform for deep learning with IBM® Power Systems™, enterprises can rapidly deploy a fully optimised and supported platform for machine learning with blazing performance. The PowerAI platform includes the most popular machine learning frameworks and their dependencies, and it is built for easy and rapid deployment. PowerAI requires installation on IBM Power Systems S822LC for HPC server infrastructure.” – IBM

 

#3 – Intel Nervana

Nervana Systems, acquired by Intel last year, is now known as Intel Nervana and referred to as ‘the next big shift inside corporate data centers.’

“Nervana has built an extensive machine learning system, which runs the gamut from an open-sourced software platform all the way down to an upcoming customised computer chip. The platform is used for everything from analysing seismic data to find promising places to drill for oil to looking at plant genomes in search of new hybrids.” – Aaron Pressman: Senior Writer at Fortune.

This state-of-the-art deep learning system is made up of curated, enterprise-grade collections of the world’s most advanced deep learning models and is updated on a regular basis.

“The Intel® Nervana™ Deep Learning Studio, a suite of tools with an easy-to-use interface, dramatically simplifies the deep learning process and accelerates time-to-solution. After you import your data, you can extend one of our state-of-the-art models or build your own. Then, you can kick off training with single click and track progress on the dashboard. All the capabilities of the platform are also accessible via a powerful command line interface.” – Intel Nervana.

 

#4 – NVIDIA Deep Learning SDK

‘The NVIDIA Deep Learning SDK provides high-performance tools and libraries to power innovative GPU-accelerated machine learning applications in the cloud, data centers, workstations, and embedded platforms.’ – NVIDIA.

The SDK offers a comprehensive development environment for building new GPU-accelerated deep learning algorithms, and its libraries for deep learning primitives, inference, video analytics, linear algebra, sparse matrices and multi-GPU communications could dramatically increase the performance of your existing applications.

“With the updated Deep Learning SDK optimised for Volta, developers have access to the libraries and tools that ensure seamless development and deployment of deep neural networks on all NVIDIA platforms, from the cloud or data center to the desktop to embedded edge devices. Deep learning frameworks using the latest updates deliver up to 2.5x faster training of CNNs, 3x faster training of RNNs and 3.5x faster inference on Volta GPUs compared to Pascal GPUs.” – NVIDIA.

 

 

For more resources, please see below:

IBM Power AI

Intel Nervana Platform

Why Deep Learning Is Suddenly Changing Your Life

NVIDIA Accelerated Computing – Deep Learning Software

Why Intel Bought Artificial Intelligence Startup Nervana Systems

TensorFlow – Google’s Latest Machine Learning System, Open Sourced For Everyone

Intel Is Paying More Than $400 Million To Buy Deep-Learning Startup Nervana Systems

PowerAI: The World’s Fastest Deep Learning Solution Among Leading Enterprise Servers

Data’s Growing Potential To Transform Business

“Big Data does not only refer to online activity but also to behaviour offline, including use of credit cards or even smartphones, which send GPS locations and records behaviour. The existence of large volumes of data that can be used for different applications provides those willing to data mine and analyse with several opportunities.” – Daniel Abela: Owner & Managing Director at Redorange.

In the past few years, Big Data analytics has become a game-changer for many businesses worldwide, with profitable outcomes achieved in successful startups like Treasure Data and MapD, and large enterprises like Amazon and Apple. With new and innovative technologies continuing to launch at a rapid pace, the potential for growth won’t be slowing down anytime soon.

“The integrated use of analytics, Big Data, the cloud, the Internet of Things (“IoT”), mobile, and application development—is driving change at unprecedented rates. Our digital economy is subject to Moore’s law and digital transformation has become the new normal.” – Forbes.

Here are some examples of how you can use data analytics to grow your business.

 

#1 – Business Intelligence For Better Decision-Making

“No matter what BI application is used, the reality is that organisations are continuously searching for ways to get more value out of their data. BI provides one of the best ways to transform data sources into interactive information that can lead to better decision making and planning.” – Lyndsay Wise: Solution Director at Information Builders.

The aim of business intelligence is to generate value, insight and support better decision-making. With a myriad of BI tools in the market delivering real-time insights on user-friendly dashboards, businesses have more power than ever when it comes to leveraging information to their advantage. We’ve rounded up a few successful ones to help you decide which tool is right for your business.

 

Qlik Sense

With the ability to easily combine your data sources and get detailed reports in an instant, Qlik has been deemed an effective and user-friendly analytics tool by its users.

“With the Associative engine at its core, Qlik Sense lets you discover insights that query-based BI tools simply miss. Freely search and explore across all your data, instantly pivoting your analysis when new ideas surface. You’re not restricted to linear exploration within partial views of data. And you get total flexibility with a cloud-ready data analytics platform that supports the full spectrum of BI use cases – ideal for any analyst, team or global enterprise.” – Qlik.

 

Sisense

“Designed to be used by people who need to consume and analyse large amounts of data but have little or no prior experience in data crunching.” – Forbes.

An industry leader in business intelligence tools, this agile tool lets you analyse and visualise both big and disparate datasets and adapts to the needs of your business.

“Our Single-Stack™ architecture takes you from data integration to visualisation with a single BI software solution, eliminating the need to use additional tools.” – Sisense.

 

Microsoft Power BI

“It is the exact visually-appealing, dynamic, and user-friendly tool every developing company needs, and has thus brought a number of critical benefits.” – financesonline.com.

Power BI is a set of business analytics tools designed to analyse data, share insights and provide a 360-degree view of important metrics on any device, with real-time updates and hundreds of connections to popular business apps.

“Power BI can unify all of your organisation’s data, whether in the cloud or on-premises. Using the Power BI gateways, you can connect SQL Server databases, Analysis Services models, and many other data sources to your same dashboards in Power BI. If you already have reporting portals or applications, embed Power BI reports and dashboards for a unified experience.” – Microsoft Power BI.

 

#2 – Digitisation Of Business Processes For Operational Efficiency & Customer Retention

“Spoiled by user experiences on Google and Amazon, people are increasingly demanding enhanced digital access to their records, as well as instantaneous access to the services they’re buying. This increases the pressure on traditional companies and leaves them vulnerable to disruption.” – Sharon Fisher: Content Strategist at The Economist Group.

Digitisation of people and processes is the future of business. The end-to-end customer experience design of your business can make or break your competitive edge. As demands and expectations grow, automation and optimisation become key to customer retention and organisational productivity.

“Intuitive interfaces, around-the-clock availability, real-time fulfillment, personalised treatment, global consistency, and zero errors—this is the world to which customers have become increasingly accustomed. It’s more than a superior user experience, however; when companies get it right, they can also offer more competitive prices because of lower costs, better operational controls, and less risk.” – McKinsey & Company.

Using Big Data analytics to implement automated operational strategies into your business model can be both a cost and time effective strategy, as well as an enabler for revenue growth.

“Automation gives fast growing companies the tools to keep up, but the how-to-get-there can seem like a daunting task. Any successful owner, founder, or CEO knows you have to plan for growth. That plan should include finding the right technology that can scale with your business — and automation must be integral to that plan.” – Salesforce.

 

 

#3 – Innovation & Growth Using Big Data Analytics Powered By Cloud Computing

“Whether making the decision to move to the cloud is instigated by economics or the ever-increasing speed of business, organisations need to get data-driven faster, and turning to the Cloud sooner rather than later may just be the answer.” – Dataversity.

Companies that maximise their use of analytics grow faster and are in a stronger position to innovate than those that don’t. Using the cloud as a platform for speed, scale, customer engagement and innovation has increased the performance of the companies below.

 

Atlassian – “Aussie startups are thriving thanks to cloud technology services. Atlassian, a company that sells $100m worth of software to 130 different countries per year is an Australian startup success story. Atlassian has grown from a tech startup making clever use of cloud technologies, to an internationally renowned, billion-dollar company.” – Amazon Web Services.

Founded in 2002, Atlassian is a software company with various collaboration tools used by enterprises and startups worldwide.

“Atlassian uses AWS to scale its issue-tracking software applications faster than before, provide improved services to tens of thousands of global customers, and enhance its disaster recovery and availability. The Australia-based organisation provides software that helps developers, project managers, and content managers collaborate better. Atlassian uses Amazon EFS to support customers deploying JIRA Data Center on AWS, and also runs an internal issue-tracking application platform on AWS.” – Amazon Web Services.

 

Pearson – Founded in 1998, Pearson is a global online education provider that offers learning resources to a wide range of people, from preK-12 education and higher education to industry professionals.

“Pearson is using the cloud to transform the way it delivers education worldwide. The cloud is enabling Pearson to establish a more flexible global hybrid infrastructure with common systems and processes, which frees up resources to invest in new, more web-oriented educational products that deliver measurable outcomes for learners. This is part of an enterprise-wide business transformation that will help accelerate the company’s shift towards fast-growing markets — like South Africa and China — and educational products that are increasingly digital in nature.” – Forbes.

 

Judo Capital – “Working with cloud based services and capabilities, provided by Itoc, has enabled us to remain focused on our true mission, while achieving our vision of an IT-less future.” – Graham Dickens: Chief Technology Officer at Judo Capital.

Judo Capital, built by a small group of highly experienced bankers, is a specialist financier designed to address the financial needs of Australian SMEs. Using Itoc, a provider of a range of cloud and DevOps services, they have been able to drive growth through better decision-making.

“Designed and built from the ground up in just 6 months, the Judo team and their technology partners have created a new breed of platform, a true ecosystem in the cloud that supports real time effective distribution of information, transparent communication and decision making. The result of which empowers Judo bankers and brokers to deliver an unrivalled service and provide customers with the opportunity to gain insight and transparency into the renowned ‘dark art’ that is today’s customer experience of SME lending.” – Richard Steven: CEO of Itoc.

 

 

For more resources, please see below:

Big Data, Huge Opportunities

Big Data & Advanced Analytics

How To Digitise Your Business In Simple Steps

Accelerating The Digitisation Of Business Processes

Why Automation Is Essential To Your Business Growth

Four Ways To Innovate Using Big Data And Analytics

Time To Digitise Business Processes, McKinsey Says

Business Transformation: How Big Data Analytics Helps

8 Ways You Can Grow Your Business Using Data Science

Four Reasons Why Big Data Analytics In The Cloud Makes Sense Now

Business Intelligence, Data Transformation And Better Decision Making

Using Rapid Process Digitisation To Transform The Customer Experience

The Importance Of Big Data and Analytics In The Era Of Digital Transformation

How Digital Disrupts Operations, Business Processes And Customer Experience

Seven Business Process Automation Benefits That Make Your Company More Money

 

Business Intelligence Tools

Sisense

Qlik Sense

Microsoft Power BI

15 Business Intelligence Tools For Small And Big Businesses

 

Businesses Leveraging Cloud Computing

Itoc

Pearson

Atlassian

Judo Capital

Amazon Web Services

Case Study: Unleashing The Potential Of Australian Businesses

The Advantages Of Cloud Computing For Startups

Three Companies That Transformed Their Businesses Using Cloud Computing

Key Players In Automation & Artificial Intelligence

“Innovations in digitisation, analytics, artificial intelligence, and automation are creating performance and productivity opportunities for business and the economy.” – McKinsey & Company.

With the rise of artificial intelligence and automation, we’ve seen a huge shift in how many jobs are being done in industries like agriculture, logistics, manufacturing and more. As technology continues to advance at a rapid pace, the number of machines performing data analysis and cognitive tasks is multiplying.

We’ve rounded up a few of the most popular automation and artificial intelligence platforms today.

 

#1 – DeepMind Technologies

Created to push boundaries, DeepMind is a world leader in AI research whose founders believe artificial intelligence will be one of the most beneficial scientific advances ever made. Acquired by Google in 2014 and backed by investors like Elon Musk, Peter Thiel and Li Ka-shing, the company’s mission is to ‘solve intelligence.’

“I think we’re going to need artificial assistance to make the breakthroughs that society wants,” Hassabis says. “Climate, economics, disease — they’re just tremendously complicated interacting systems. It’s just hard for humans to analyse all that data and make sense of it. And we might have to confront the possibility that there’s a limit to what human experts might understand. AI-assisted science will help the discovery process.” – Demis Hassabis: Founder & CEO of DeepMind.

 

#2 – IBM Automation With Watson

With Watson, companies are able to get actionable insights through the combination of automation and analytics. IBM claims it delivers more value to customers and makes employees more productive by striking a better balance between cost and performance.

“IBM Automation With Watson has the capability to understand natural language, think, learn and get smarter over time. This level of automation involves more than just replacing redundant tasks with software. It’s capabilities that are enabled by analytics, cloud, mobile and cognitive computing.” – IBM.

 

#3 – Amazon Echo

This artificially intelligent Bluetooth speaker can make your house a whole lot smarter. Now available for purchase to the public, this voice-controlled assistant is being called ‘the future of home automation.’

“Amazon Echo is a hands-free speaker controlled with your voice. It features a personal assistant called Alexa, who will perform various tasks for you and control various systems. There are seven microphones within Echo, all of which feature enhanced noise cancellation and far field voice recognition, meaning you can ask Alexa a question from any direction, even when playing music, and she should still hear you.” – Britta O’Boyle: Features Editor at Pocket-lint.

Got any questions about AI & Machine Learning? Check out Contexti’s partnership with Amazon Web Services.

 

#4 – Google Home

Google Home, powered by Google Assistant, launched in Australia earlier this year as Amazon Echo’s rival in the home automation game. Which voice assistant you prefer comes down to your priorities, which services you already subscribe to, and whether they are compatible with the device.

“While Amazon may have a head start, Google’s been doing AI and voice commands for years, so both devices are pretty powerful already. Of course, Amazon has already proven that it will add new updates to the Echo regularly, but we’ll have to wait and see if Google will keep up that same pace.” – Eric Ravenscraft: Writer at Lifehacker Australia.

 

 

 

For more resources, please see below:

Google Home

DeepMind: Inside Google’s Super-Brain

IBM Shaping The Future Of Cognitive Automation

What’s Now And Next In Analytics, AI & Automation

The Age Of Analytics: Competing In A Data-Driven World

IBM Watson takes on IT Services With New Automation Platform

Amazon Echo Is The First Artificial Intelligence You’ll Want At Home

Smart Home Assistant Showdown: Amazon Echo Vs. Google Home

Amazon Echo: What Can Alexa Do & What Services Are Compatible?

Amazon Echo Vs. Google Home: Which Voice Controlled Speaker Is Best For You?

Career Opportunity For Linux Administrators At Contexti | Big Data Analytics

Location: Sydney, Australia

 

ABOUT US

Contexti is a specialist Big Data Analytics solutions company serving the Australian market. Through strong partnerships with AWS, Cloudera, Talend and Mesosphere, we provide training, consulting and managed services for some of Australia’s leading enterprises where data is at the heart of their business transformation.

We are a small but growing Sydney-based team; most work is performed from our CBD office location, but occasionally we will work from customer sites. We prioritise cultivating positive relationships with customers and seek to provide a good team atmosphere with a healthy work-life balance. While not for everyone, a significant portion of us are also on a quest to identify and consume the best Ramen and Laksa Sydney has to offer.

 

THE OPPORTUNITY

We’re looking for an experienced Linux Administrator who will be responsible for maintaining, designing, implementing, and monitoring predominantly cloud-based systems. Collaboration with other team members is essential as we continue to develop automation strategies and deployment processes. You will become an integral part of the team, taking ownership of implementation work as well as working ad hoc problems through to resolution.

Your responsibilities:

  • Implement, maintain and support solutions that enable positive outcomes for our customers.
  • Take on complex integration problems, making diverse application components work together.
  • Help tune performance and ensure high availability of infrastructure.
  • Design and develop infrastructure monitoring and reporting tools.
  • Develop and maintain configuration management solutions.
  • Develop test automation frameworks in collaboration with the rest of the team.
  • Create tools to help teams make the most out of the available infrastructure.

 

ABOUT YOU

The person:

  • You’re looking for purpose in your work and the company you work for
  • You’re ready to learn, grow and contribute
  • You love to help customers and team members

 

You have the following skills:

  • Experience with Linux servers in virtualized environments.
  • Familiarity with the fundamentals of Linux scripting languages including Bash and Python.
  • Experience installing, configuring, and maintaining services such as Apache httpd, MySQL, nginx, etc.
  • Exposure to Kerberos, and better still experience integrating Linux with Active Directory, e.g. SSSD, Centrify, etc.
  • Basic knowledge of configuration management tools (e.g. Ansible, Puppet and Chef).
  • Exposure to infrastructure versioning, provisioning and continuous integration tools (e.g. Jenkins, HashiCorp Terraform/Vagrant/Packer).
  • Familiarity with load balancing (e.g. HAProxy), firewalls, etc.
  • Experience with containerisation technologies, such as Docker, Apache Mesos, DC/OS, etc.
  • Experience with prominent core AWS services (VPN, VPC, EC2, EBS, S3, IAM, CloudWatch etc).
  • Hadoop experience would be an added bonus, but is not a requirement – you can learn this from us.
  • Most likely you will have a Computer Science degree; you will certainly have relevant industry experience.

 

HOW TO APPLY

  • You must hold the right to work in Australia to be considered for this role.
  • Please send your resume and short cover note to jobs@contexti.com

Note to Recruiters – We will be filling these roles directly.

For more resources, please see below:

How To Succeed On Your Big Data Journey
5 Tips On How To Land A Big Data Job In Australia
Data & Analytics Australian Recruitment Market Insights By FutureYou

 

3 Strategies For Getting The Most Value From Your Data Lake

“‘Big Data’ and ‘data lake’ only have meaning to an organisation’s vision when they solve business problems by enabling data democratisation, re-use, exploration, and analytics.” – Carlos Maroto: Technical Manager at Search Technologies.

A data lake is a storage repository that acts as the central source of all your organisation’s current and historical data, both structured and unstructured. This data is transformed as it moves through the pipeline for purposes such as analysis, quarterly and annual reporting, machine learning and data visualisation. The information contained in a data lake can be a highly valuable asset; without the right structure, however, your data lake could turn into a data swamp.
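
As a toy illustration of the ‘structure’ point, the sketch below (all names and data hypothetical) tags every dataset with basic metadata on ingestion, so the lake stays searchable rather than sliding into a swamp.

```python
# A toy, hypothetical sketch: tag every dataset with metadata on ingestion
# so the lake stays searchable instead of degrading into a "data swamp".
import datetime

catalog = {}  # dataset name -> payload plus the metadata needed to trust it

def ingest(name, data, source, schema=None):
    """Store structured or unstructured data alongside basic metadata."""
    catalog[name] = {
        "data": data,
        "source": source,                 # where the data came from
        "schema": schema,                 # None is fine for unstructured data
        "ingested_at": datetime.datetime.utcnow().isoformat(),
    }

ingest("web_clicks_q3", [{"user": 1, "page": "/home"}],
       source="clickstream", schema=["user", "page"])
ingest("support_emails", ["Hi, my order never arrived..."], source="helpdesk")

# Analysts can discover what exists without trawling through opaque files.
for name, meta in catalog.items():
    print(name, "from", meta["source"], "at", meta["ingested_at"])
```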

Here are three strategies for getting the most value from your data lake.

 

#1 – BUSINESS STRATEGY & TECHNOLOGY ALIGNMENT

“It’s important to align goals for your data lake with the business strategy of the organisation you’re working to support.” – Bizcubed.

What are the business goals you’re trying to achieve with your data lake? Operational efficiency? Better understanding of your customers? Will your current infrastructure help you achieve this while also maximising your profits? Aligning your goals with the technology you’re planning to implement will not only help you articulate what problem you’re trying to solve, but also improve your chances of gaining executive buy-in and winning the support of your team. The better the plan, the easier it is to identify possible roadblocks and the higher the chance of success.

“As technology teams continue to be influenced by the hype and disruption of Big Data, most fail to step back and understand where and how it can be of maximum business value. Such radically disruptive new business processes can’t be implemented without knowledge gathering and understanding how Big Data technology can become a catalyst for organisation and cultural change.” – Thierry Roullier: Director of Product Management at Infogix, Inc.

 

#2 – INTEGRATION & ARCHITECTURE

“You need to be able to integrate your data lake with external tools that are part of your enterprise-wide data view. Only then will you be able to build a data lake that is open, extensible, and easy to integrate into your other business-critical platforms.” – O’Reilly.

Technology is moving at a rapid pace. The tools you use in your business may not cooperate well with your data lake, and may not support the data architectures of tomorrow. During the implementation process, one of the first things to look at is how adaptable your long-term technology investments are.

Big Data architectures are constantly evolving, and it’s important to select flexible data processing engines and tools that can handle changes to security, governance and structure without being too costly to the organisation. Before implementing anything, you need to have a clear vision of what you want the end technical platform to look like, and what components you will need to make that happen.

“Modern data onboarding is more than connecting and loading. The key is to enable and establish repeatable processes that simplify the process of getting data into the data lake, regardless of data type, data source or complexity – while maintaining an appropriate level of governance.” – Bizcubed.

 

#3 – DATA VIRTUALISATION & DEMOCRATISATION

“Data virtualisation involves abstracting, transforming, federating and delivering data from disparate sources. The main goal of data virtualisation technology is to provide a single point of access to the data by aggregating it from a wide range of data sources.” – TechTarget.

Data lakes and data virtualisation tools work well together to solve different problems and provide a layer of intelligence that results in more agility and adaptability to change.

“As an example, a virtual layer can be used to combine data from the data lake (where heavy processing of large datasets is pushed down) with golden records from the MDM that are more sensitive to stale copies. The advance optimisers of modern data virtualisation tools like Denodo make sure that processing is done where it is more convenient, leveraging existing hardware and processing power in a transparent way for the end user. Security and governance in the virtual layer also add significant value to the combined solution.” – datavirtualizationblog.com.
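
As a rough illustration of that idea, the hypothetical sketch below federates two stand-in sources behind a single point of access; the source functions stand in for connectors to a data lake and an MDM system, and are not real library calls.

```python
# A minimal, hypothetical sketch of data virtualisation: a single point of
# access that federates disparate sources.

def query_data_lake(customer_id):
    return {"total_orders": 42}        # stand-in for heavy lake processing

def query_mdm(customer_id):
    return {"name": "Jane Doe"}        # stand-in for golden master records

def virtual_layer(customer_id):
    """Aggregate results from every registered source into one record."""
    record = {}
    for source in (query_data_lake, query_mdm):
        record.update(source(customer_id))
    return record

print(virtual_layer(101))  # {'total_orders': 42, 'name': 'Jane Doe'}
```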

Data democratisation is the ability for information in a digital format to be accessible to the average end user. The goal of data democratisation is to allow non-specialists to be able to gather and analyse data without requiring outside help.

“Data must be freed from its silos. Today, it resides in a variety of independent business functions, such as HR, manufacturing, supply chain logistics, sales order management and marketing. To get a unified view of this data, businesses are engaging in a variety of ad-hoc, highly labor-intensive processes.” – Computer Weekly.

 

For more resources, please see below:

Best Practices For Data Lakes

How To Build A Successful Data Lake

Five Keys To Creating A Killer Data Lake

Avoiding The Swamp: Data Virtualisation & Data Lakes

Democratising Enterprise Data Access: A Data Lake Pattern

How To Successfully Implement A Big Data/ Data Lake Project

Top Five Differences Between Data Lakes & Data Warehouses

 

Cyber Security Strengthened By Big Data Analytics & Machine Learning

Information is one of a business’s most valuable assets, which is why organisations everywhere are recognising the importance of data in business and the economy. But our heavy reliance on information to make decisions requires an understanding of how to protect it.

With growing volumes of data causing new cyber threats to surface daily, data practitioners who utilise preventative technologies to bridge the security gap have a competitive advantage when it comes to gaining the trust of their clients. Digital innovation enabled by data and analytics has taken the world by storm and is present in our everyday lives, even on our wrists. With wearable technology and mobile devices collecting a vast amount of information about us, it’s no surprise that security and privacy have become primary concerns.

“The sophistication, ferocity, and scope of attacks have also increased. We’ve moved beyond merely defending against criminals. We’re now fighting back against nation states, organised crime, and a troubling new trend: criminal organisations hacking on behalf of rogue nations.” – TechRepublic

To combat this threat, the use of analytics and machine learning is adding real value to businesses looking to build up their defences.

“Big Data and analytics is showing promise with improving cyber security. 90% of respondents from MeriTalk’s new U.S. government survey said they’ve seen a decline in security breaches.” – SentinelOne.

 

DETECTING & PREVENTING CYBER THREATS

“It’s data that’s getting stolen, but it’s also data that can come to the rescue. You just have to know how to use it in the right way.” – Susan O’Brien: Vice President of Marketing at Datameer.

According to the 2016 Big Data Cybersecurity Analytics Research Report, 72 percent of respondents said that Big Data Analytics played an important role in detecting advanced cyber threats.

Here are some examples of how businesses can use Big Data Analytics to detect and prevent cyber attacks.

 

#1 – USING HISTORICAL DATA

With worldwide data reaching unprecedented levels, new cyber threats are emerging daily. To combat this, an article in CSO discusses the benefits of using historical data to identify potential cyber attacks while also predicting future events.

“Using this historical data, you can create statistical baselines to identify what is ‘normal’. You will then be able to determine when the data deviates from the norm. This historical data can also create new possibilities for predictive models, statistical models, and machine learning.”
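
As a toy illustration of that approach, the sketch below (all numbers made up) builds a statistical baseline from historical counts and flags observations that deviate from the norm.

```python
# A minimal sketch of baselining: learn what "normal" looks like from
# historical data, then flag deviations. The history here is hypothetical.
import statistics

# Hypothetical history: failed logins per hour over the past week.
history = [3, 5, 4, 6, 2, 5, 4, 3, 5, 4, 6, 5, 3, 4]
mean = statistics.mean(history)
stdev = statistics.pstdev(history)

def is_anomalous(observation, threshold=3.0):
    """Flag observations more than `threshold` standard deviations from baseline."""
    return abs(observation - mean) > threshold * stdev

print(is_anomalous(5))   # False: within the normal range
print(is_anomalous(48))  # True: a likely brute-force attempt
```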

 

#2 – MONITORING EMPLOYEE ACTIVITY

“Employing a system monitoring program where the HR person or compliance officer can replay the behavior of an insider is invaluable.” – Kevin Prince: CEO of StratoZen.

Frequent news headlines about “inside jobs” involving data hacks and leaking of information make it hard to ignore the fact that employee-related breaches are on the rise.

By ensuring that access to sensitive information is limited only to the relevant employees, and appropriate policies and procedures are put in place to protect and monitor the use of information, organisations can prevent security breaches by staff.

“Unauthorised access is when staffers use applications to view files or change data they should not be able to touch. This usually requires another employee, such as a system administrator, to be lax with system access controls. Data theft or destruction can follow.” – Justin Kapahi: Vice President of Solutions & Security at External IT.
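
A minimal, hypothetical sketch of that principle: restrict each dataset to the relevant roles, and record every access attempt so behaviour can be replayed later, as the quote above suggests.

```python
# A toy access-control-plus-audit sketch; all roles and datasets are made up.
ACCESS_POLICY = {
    "payroll": {"hr_manager", "finance_lead"},   # roles allowed per dataset
    "customer_pii": {"compliance_officer"},
}

audit_log = []

def request_access(user, role, dataset):
    allowed = role in ACCESS_POLICY.get(dataset, set())
    audit_log.append((user, role, dataset, "granted" if allowed else "DENIED"))
    return allowed

request_access("alice", "hr_manager", "payroll")     # granted
request_access("bob", "developer", "customer_pii")   # denied and recorded

for entry in audit_log:
    print(entry)  # the trail a compliance officer can replay later
```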

 

#3 – EDUCATING YOUR TEAM

Although it’s crucial to take the right security measures, educating your team on how to recognise potential threats is just as important. Cyber criminals are targeting employees in many ways, including text, email, phone calls, fake websites and dangerous links that could give hackers possession of an organisation’s most confidential information.

“Hackers routinely target workers who are dangerously oblivious to proper cybersecurity practices. Managers who care about protecting their clients, their firms and themselves must prioritize educating employees of all levels on how breaches occur.” – Tech Center.

 

#4 – DEPLOYING AN INTRUSION DETECTION SYSTEM

Data encryption, multi-factor authentication and firewalls are all common security measures, but another important precaution to take is deploying an Intrusion Detection System (IDS).

“IDS provides an umbrella to the network by monitoring all traffic on specific segments that may contain malicious traffic or have mal-intent. The sole function of a network-based IDS is to monitor the traffic of that network.” – TechTarget.

When deploying an Intrusion Detection System, it’s important to understand the requirements of your business in order to select the one most suitable for the company’s infrastructure.

“Intrusion detection and prevention should be used for all mission-critical systems and systems that are accessible via the Internet, such as Web servers, e-mail systems, servers that house customer or employee data, active directory server, or other systems that are deemed mission critical.” – IT Business Edge.
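
As a toy illustration of the kind of traffic monitoring a network-based IDS performs, the sketch below (with made-up connection records) flags a source probing an unusual number of distinct ports, a classic scan signature.

```python
# A toy sketch of threshold-style IDS monitoring on hypothetical
# connection records of (source IP, destination port).
from collections import defaultdict

connections = [
    ("10.0.0.5", 80), ("10.0.0.5", 443),
    ("10.0.0.9", 22), ("10.0.0.9", 23), ("10.0.0.9", 25),
    ("10.0.0.9", 80), ("10.0.0.9", 443), ("10.0.0.9", 3306),
]

ports_per_source = defaultdict(set)
for src, port in connections:
    ports_per_source[src].add(port)

# One source hitting many distinct ports suggests a port scan.
for src, ports in ports_per_source.items():
    if len(ports) >= 5:
        print(f"ALERT: possible port scan from {src} ({len(ports)} ports)")
```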

 

For more resources, please see below:

8 Ways To Prevent Data Breaches

How Big Data Is Improving Cyber Security

Your Biggest Cyber Security Threat? Your Employees

Hacker Hunting: Combatting Cybercrooks With Big Data

Intrusion Detection System Deployment Recommendations

Challenges to Cyber Security & How Big Data Analytics Can Help

Big Data & Machine Learning: A Perfect Pair For Cyber Security?

Healthcare, Cybersecurity & Innovation In The Wearable Technology Market

Big Data Analytics Strengthen Cybersecurity Postures, Reveals Ponemon Institute Report

Five Tips For Data Efficiency

At Contexti, we’re always looking for new ways to make it easier to work with data.

When it comes to Big Data projects, it’s all about efficiency. We’ve rounded up the five best tips on how to make it happen.

 

#1 – DATA COMPRESSION

This can be a great way to reduce repetitive information, shorten transmission times and free up storage space. The process of encoding data more efficiently to achieve a reduction in file size can happen in two ways: lossless and lossy compression.

“Lossless compression algorithms use statistic modeling techniques to reduce repetitive information in a file. Some of the methods may include removal of spacing characters, representing a string of repeated characters with a single character or replacing recurring characters with smaller bit sequences.” – Conrad Chung: Customer Service & Support Specialist at 2BrightSparks.

The great thing about lossless compression is that no data is lost during the compression process. Lossy compression, on the other hand, works very differently: it permanently discards some information, which is acceptable for multimedia files such as images and music.

“These programs simply eliminate ‘unnecessary’ bits of information, tailoring the file so that it is smaller. This type of compression is used a lot for reducing the file size of bitmap pictures, which tend to be fairly bulky.” – Tom Harris: Contributing writer at HowStuffWorks.
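
As a quick illustration of the lossless case, the sketch below uses Python’s built-in zlib module to shrink repetitive data and then recover it byte-for-byte.

```python
# A minimal sketch of lossless compression using Python's built-in zlib.
import zlib

original = b"AAAAABBBBBCCCCC" * 100   # repetitive data compresses very well

compressed = zlib.compress(original, 9)   # 9 = maximum compression level
restored = zlib.decompress(compressed)

print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")
assert restored == original   # lossless: the round trip recovers every byte
```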

 

#2 – CLOUD OPTIMISATION

“If your organisation wants to extract the highest level of application performance out of the computing platforms that it purchases, you should ensure that workloads are optimised for the hardware they run on.” – Joe Clabby: Contributor at TechTarget.

Choosing the right cloud services to achieve this requires consideration of efficiency, performance and cost advantage. A great tool for workload optimisation is the Cloudera Navigator Optimizer for Hadoop-based platforms.

“Cloudera Navigator Optimizer gives you the insights and risk-assessments you need to build out a comprehensive strategy for Hadoop success.” – Cloudera Inc.

Not only does it reduce risk and provide usage visibility, it’s also flexible and keeps up with changes in demand. “Simply upload your existing SQL workloads to get started, and Navigator Optimizer will identify relative risks and development costs for offloading these to Hadoop based on compatibility and complexity.”

 

#3 – UNIFIED STORAGE ARCHITECTURE

Many enterprises experience the same dilemma: unified storage system or traditional file/block storage system?

Randy Kerns, Senior Strategist & Analyst at Evaluator Group, describes unified storage as “a system that can do both block and file in the same system. It will meet the demands for applications that require block access, plus all of the file-based applications and typical user home directories you have.”

With the ability to simplify deployment and manage systems from multiple vendors, unified storage architecture is growing in popularity among storage administrators who are quickly seeing the benefits of the distributed access and centralised control it provides.

An article in TechTarget highlights the key benefits of running and managing files and applications from a single device: “One advantage of unified storage is reduced hardware requirements. Unified storage systems generally cost the same and enjoy the same level of reliability as dedicated file or block storage systems. Users can also benefit from advanced features such as storage snapshots and replication.”

 

#4 – DEDUPLICATION

“Deduplication is touted as one of the best ways to manage today’s explosive data growth.” – Brien Posey: Technology Author at TechRepublic.

Data deduplication is a technique for eliminating redundant or duplicate data in a data set, thereby maximising storage savings and increasing the speed and efficiency at which data is processed.

By reducing the amount of storage space an organisation needs to save its data, you’re not only saving time and money, but preserving the integrity and security of your data. “The simple truth is that to be effectively managed, adequately protected and completely recovered, your data size must be shrunk.” – Christophe Bertrand: VP of Product Marketing at Arcserve.

Here’s how it works: “Each chunk of data (e.g., a file, block or bits) is processed using a hash algorithm, generating a unique number for each piece. The resulting hash number is then compared to an index of other existing hash numbers. If that hash number is already in the index, the data does not need to be stored again. Otherwise, the new hash number is added to the index and the new data is stored.” – TechTarget.
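
The sketch below illustrates that hash-and-index mechanism on a few made-up chunks; real systems chunk at the file, block or bit level, as the quote notes.

```python
# A minimal sketch of hash-based deduplication, along the lines above.
import hashlib

def deduplicate(chunks):
    """Store each unique chunk once, keyed by its SHA-256 hash."""
    index = {}    # hash -> stored chunk (each unique chunk stored once)
    layout = []   # sequence of hashes describing the original stream
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:
            index[digest] = chunk    # new data: store it
        layout.append(digest)        # duplicates only add a reference
    return index, layout

chunks = [b"block-A", b"block-B", b"block-A", b"block-A"]
store, layout = deduplicate(chunks)
print(f"{len(chunks)} chunks stored as {len(store)} unique blocks")
assert b"".join(store[h] for h in layout) == b"".join(chunks)  # exact rebuild
```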

 

#5 – CROSS-CHANNEL ANALYTICS

“Cross-channel analytics is where multiple sets of data from different channels are linked together and analyzed in order to provide customer and marketing intelligence that the business can use. This can provide insights into which paths the customer takes to conversion or to actually buy the product or avail of the service. This then allows for proper and informed decision making to be made.” – Techopedia.
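
As a toy illustration of the linking step, the sketch below (all data hypothetical) joins events from different channels on a shared customer ID to reconstruct each customer’s path to conversion.

```python
# A toy sketch of cross-channel linking on a shared customer ID.
web = [{"customer": 1, "action": "viewed product"},
       {"customer": 2, "action": "abandoned cart"}]
email = [{"customer": 1, "action": "clicked offer"}]
in_store = [{"customer": 1, "action": "purchased"}]

journeys = {}
for channel, events in (("web", web), ("email", email), ("in-store", in_store)):
    for event in events:
        journeys.setdefault(event["customer"], []).append(
            (channel, event["action"]))

# Customer 1's path (web -> email -> in-store) reveals which channel
# combination actually led to the conversion.
for customer, path in journeys.items():
    print(customer, " -> ".join(f"{c}: {a}" for c, a in path))
```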

Among the many benefits of this process are understanding the impact of each channel and how channels work together, and determining which channel combinations deliver the highest results and conversions. It’s an efficient system that generates insights useful to each department within your organisation.

“Business leaders can use this information to design better process flows for customers by creating or revising customer journey maps. Meanwhile, marketers can use behavioral data from customer interactions in different channels for other purposes.” – TIBCO Blog.

 

For more resources, please see below:

 

Data Efficiency

What Are The Data Efficiency Technologies?

Performance: The Key To Data Efficiency

 

Data Compression

How File Compression Works

How Big Is Your Data, Really?

The Basic Principles of Data Compression

Data Compression: Advantages and Disadvantages

 

Cloud Optimisation

Cloudera Navigator Optimiser

Application Performance Tips: Workload Optimisation and Software Pathing

 

Unified Storage Architecture

Advantages of Using Unified Storage!

Unified Storage (Multiprotocol Storage)

Unified Storage Architecture Explained

Unified Storage Architecture: The Path To Reducing Long-Term Infrastructure Costs

 

Data Deduplication

What Is Data Deduplication?

How Data Deduplication Works

10 Things You Should Know About Data Deduplication

The ABCs Of Data Deduplication: Demystifying The Different Methods

Understanding Data Deduplication – And Why It’s Critical For Moving Data To The Cloud

 

Cross-Channel Analytics

What Is Cross-Channel Analytics?

Big Data Analytics: The Key To Understanding The Cross-Channel Customer