Sílvia Raposo


Advanced Analytics: learn how to elevate data analysis to a whole new level

Implementing a business intelligence model requires more than just gathering data; above all, it is about converting that big data into valuable insights that add value to the business. However, if there is no model in place that allows you to analyse and understand this incoming data, all you will get is numbers without context and without added value.

In order to perform a correct data analysis, it is necessary to understand that there is no single valid method of analysis; the right methodology depends on the requirements at hand and on the type of data collected.

However, there are methods common to most advanced analytics that can turn data into added value, even when there are no well-defined business rules to start from, transforming a mass of raw data into relevant business insights that enable well-founded decision-making.

Quantitative data and qualitative data

Before covering the various methods, let's identify the type of data you want to analyse. For quantitative data, the focus is on raw numbers, as the name suggests. Examples of this type of data include sales figures, marketing data, payroll data, revenue and expenses, etc. Basically, it refers to data that is quantifiable and objectively measurable.

Qualitative data, on the other hand, is harder to interpret at first glance, since it is unstructured, more subjective and interpretive in nature. Examples at this end of the spectrum include information collected from surveys or polls, employee interviews, customer satisfaction questionnaires and so on.

Measuring quantitative data

Looking at the analysis of quantitative data, there are four methods capable of taking that very same analysis to the next level.

  1. Regression analysis

The choice of the best type of statistics will always depend on the main goal of the research.

Regression analysis models the relationship between a dependent variable and one or more independent variables. In data mining, this technique is used to predict values in a particular dataset. For example, it can be used to predict the price of a product while taking other variables into account. It can also be useful for identifying trends and correlations between different factors.

Regression is one of the most widely used data analysis methods in business, applied to management, marketing planning, financial forecasting and much more.
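To make the idea concrete, here is a minimal sketch, assuming Python with scikit-learn and purely illustrative numbers (the advertising-spend and competitor-price variables are invented for the example):

```python
# Minimal regression sketch: predicting a product price from two independent variables.
# The figures below are invented purely for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical observations: [advertising spend, competitor price] -> our price
X = np.array([[1000, 19.9], [1500, 21.5], [800, 18.0], [2000, 22.0], [1200, 20.5]])
y = np.array([20.5, 22.0, 19.0, 23.5, 21.0])

model = LinearRegression().fit(X, y)
print("Coefficients:", model.coef_)                       # effect of each independent variable
print("Predicted price:", model.predict([[1700, 21.0]]))  # prediction for a new observation
```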

  2. Hypothesis testing/significance testing

This method, also called t-testing, determines whether a given premise holds true for the dataset in question. In data analysis and statistics, a result is considered statistically significant when it is unlikely to have occurred by random chance alone. The procedure uses probability theory to draw inferences about quantities of interest in a population from an observed sample.
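As an illustration only, assuming Python with SciPy and two synthetic samples (for instance, sales before and after a campaign), a two-sample t-test could be run like this:

```python
# Minimal significance-test sketch: are two synthetic samples significantly different?
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=100, scale=10, size=50)   # e.g. daily sales before a campaign
group_b = rng.normal(loc=105, scale=10, size=50)   # e.g. daily sales after a campaign

t_stat, p_value = stats.ttest_ind(group_a, group_b)
# With a 0.05 significance level, a p-value below 0.05 suggests the difference
# is unlikely to be explained by random chance alone.
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```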

  3. Monte Carlo simulation

One of the most popular methods for calculating the effect of unpredictable variables on a specific factor is the Monte Carlo simulation, which uses probability modelling to help predict risk and uncertainty. To test a scenario or hypothesis, the simulation uses random numbers and data to generate a range of possible outcomes. This tool is frequently used in project management, finance, engineering and logistics, amongst other areas. By testing a wide variety of hypotheses, it is possible to discover how a series of random variables can affect plans and projects.
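A minimal sketch, assuming Python with NumPy and invented cost figures, shows how such a simulation estimates the spread of possible outcomes:

```python
# Minimal Monte Carlo sketch: distribution of a project's total cost when two
# cost components are uncertain. All figures are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_simulations = 100_000

labour_cost = rng.normal(loc=50_000, scale=8_000, size=n_simulations)       # uncertain labour cost
material_cost = rng.triangular(20_000, 25_000, 40_000, size=n_simulations)  # uncertain material cost
total_cost = labour_cost + material_cost

print("Expected total cost:", round(total_cost.mean()))
print("95th percentile (budget at risk):", round(np.percentile(total_cost, 95)))
```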

  4. Artificial neural networks

This computational model replicates the human central nervous system (specifically, the brain), allowing the machine to learn by observing data: so-called machine learning. This type of information processing mimics biological neural networks, using a biologically inspired model to process information, learn through analysis and make predictions at the same time. In this model, the algorithms start from sample inputs and apply inductive reasoning, extracting rules and patterns from large sets of data.
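As a hedged sketch, assuming Python with scikit-learn's MLPClassifier and synthetic data, a small neural network can be trained and evaluated in a few lines:

```python
# Minimal artificial neural network sketch on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# One hidden layer with 16 neurons learns patterns from the sample inputs
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)
print("Accuracy on unseen data:", net.score(X_test, y_test))
```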


Web content management

What it is for, what the advantages are and which technologies are currently trending

A web content management system (WCMS) is a type of CMS (content management system): a set of tools for managing digital information stored on a website, which also allows users to create and manage content without any knowledge of programming or markup languages such as XML. A WCMS is a program that helps users to maintain, control, change and adjust the content on a webpage.

WCMS behaves similarly to a traditional content management system – managing the integrity, editing and information lifecycle – but is specifically designed for handling web content.

The typical functionality of a WCMS might include the ability to create and store personalised content on the website, with editors being able to review and approve content before it is published and to configure an automated publication process. There is an increasing need for such platforms to provide both creative options and accessibility, not just for content but for the entire user experience: solutions that manage the uploaded content and make it possible to monitor the entire user journey, regardless of the channel being used.

Pros and cons

There are several elements to consider when using a WCMS.

On the one hand, WCMS platforms are usually inexpensive and intuitive to use, as they don’t require technical programming expertise in order to manage and create content. The WCMS workflow can also be personalised by creating several accounts to manage different profiles.

On the other hand, WCMS implementations can sometimes be extremely costly, demanding specific training or certification. Maintenance can also incur extra expense, for licensing upgrades or updates. Security can be a concern too, given that in the event of a security threat, hackers might exploit vulnerabilities, potentially damaging user perceptions of the brand.

Choosing the right WCMS solution

With a WCMS, the content is predominantly stored in a database and grouped using the help of a flexible language such as XML or .Net.

There are several options using open-source WCMS, such as WordPress, Drupal and Joomla for more generic functions. But there are also solutions that cater to specific needs, such as, for example, the Marketing 360 platform, Filestack and CleanPix.

There are also commercial solutions on the market, such as Sitecore, which combines WCMS, Content Personalization, Content Marketing, Digital Asset Management and E-Commerce components in a single platform. This is one of the platform's major advantages: instead of acquiring and integrating separate components that consume content and information from an adjacent system, contact data and the interactions captured across the different channels are already available in the platform, ready to be used and processed by different functions and for different purposes: creating campaigns, sending emails, building marketing workflows, defining customisation rules and more.

WCMS solutions provide different functionalities, with several levels of depth and specific purposes. Before selecting the platform, consider the following functionalities:

  • Configuration: ability to activate and deactivate functionality using specific parameters.
  • Access management: managing users, permissions and groups.
  • Extension: the capacity to install and configure new functionalities and/or connectors, including the ability to install models with new functionalities.
  • Customisation: ability to change specifications to customise some features, through toolkits or interfaces.
  • WYSIWYG: capacity to provide a “What you see is what you get” mechanism, allowing content managers to know, while making alterations, what the users will see after launching a new version of the content. A good example of this is Sitecore’s “Experience Editor”.
  • Integration: ability to integrate the WCM solution with other previously installed solutions, or with external solutions in order to gather information from both ends; for example, integration with Microsoft CRM Dynamics 365 or Microsoft SharePoint.
  • Flows: capacity to incorporate a flow configuration mechanism for content approval and alteration, from different content creators with different profiles, plus content publishing.
  • User experience: editing is less complex, with built-in templates that add a predetermined functionality to the page, with no additional training needed.
  • Technical assistance and updates: consider the degree of technical support you will receive, as well as the level of accessibility for making system updates.

The advantages of WCMS

A major advantage of a WCMS is that the software gives you consistent control over the look and feel of the website (brand, wireframes, navigation) while also providing the functionality to create, edit and publish content: articles, photo galleries, video, etc. A WCMS can be the best solution for companies looking for a rich content repository focused on brand consistency.

Other advantages:

  • Automated templates;
  • Controlled access to the page;
  • Scalability;
  • Tools that allow simple editing, via WYSIWYG solutions;
  • Regular software updates;
  • Workflow management;
  • Collaboration tools that allow multiple users with permission to modify content;
  • Document management;
  • Ability to publish content in several languages;
  • Ability to retrieve older editions;
  • Ability to analyse content across devices (desktop, mobile, tablet, watch);
  • Omnichannel content availability.

Our vision

Content management is a relevant topic, although not a recent one. However, one topic that has gained a lot of traction in recent years is the capacity to use customised content to offer a relevant experience to every user. In order to achieve this goal, Xpand IT decided to partner with Sitecore, because we believe it to be the best platform for addressing customisation challenges, benefiting from the advantages mentioned above and also from the fact that Sitecore allows Headless implementations (separating the entire content from the presentation layer) as well as integration with mobile platforms (producing true omnichannel solutions). We are certain that this technology has a lot to offer, and we look forward to the new functionalities that will soon be available, launched with the intent of fulfilling our vision: offering relevant and personalised content for everyone, at any time, in any place.



Xpand IT enters the FT1000 ranking: Europe’s Fastest Growing Companies

Xpand IT proudly announces our entry into the Europe’s Fastest Growing Companies ranking, compiled by the renowned international newspaper the Financial Times! With sustained growth surpassing 45% in 2018, Xpand IT earned a place among the 1000 fastest-growing European companies, based on consolidated results between 2014 and 2017.

Revenue of 10 million and a team of 195 employees were the figures that secured our place on this list. Our revenue has since leapt to 15 million, and we can now count on the tireless work of more than 245 employees. Of the three Portuguese tech companies that earned a spot in the ranking, Xpand IT can boast the best results in terms of revenue and the acquisition of new talent.

Paulo Lopes, CEO & Senior Partner of Xpand IT, said: “Having a place on the FT 1000 European ranking is the ultimate recognition for all the work we have undertaken over the last few years. We are renowned for our know-how and expertise within the technology arena, and now also for our unique team and business culture, focused on excellence and innovation, which makes it far easier to achieve these kinds of results.”

This year’s goal is to maintain our growth trend, not just by expanding into new markets, but also by increasing our workforce. In 2019, we expect to reach the beautifully rounded number of 300 Xpanders!


7 steps to implement a data science project

Data science is a set of methods and procedures applied to a complex, concrete problem in order to solve it. It can use data inference, algorithm development and technology to analyse collected data, understand certain phenomena and identify patterns. Data scientists must possess mathematical and technological knowledge, along with the right mindset, to achieve the expected results.

By unifying several concepts, such as statistics, data analysis and machine learning, the main objective is to uncover behaviours, trends or inferences in the data that would be impossible to identify through simple analysis. The discovery of valuable insights allows companies to make better business decisions and leverage important investments.

In this blog post, we unveil 7 important steps to facilitate the implementation of data science.

1. Defining the topic of interest / business pain-points

In order to initiate a data science project, it is vital for the company to understand what it is trying to discover. What is the problem presented to the company, or what kind of objectives does the company seek to achieve? How much time can the company allocate to working on this project? How should success be measured?

For example, Netflix uses advanced data analysis techniques to discover viewing patterns from their clients, in order to make more adequate decisions regarding what shows to offer next; meanwhile, Google uses data science algorithms to optimise the placement and demonstration of banners on display, whether for advertisement or re-targeting.

2. Obtaining the necessary data

After defining the topic of interest, the focus shifts to collecting the data needed for the project from the available sources. There are innumerable data sources: the most common are relational databases, but there are also various semi-structured sources of data. Another way to collect the necessary data is to establish connections to web APIs or to collect data directly from relevant websites for later analysis (web scraping).
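Purely as an illustration, assuming Python with the requests and pandas libraries and a hypothetical API endpoint (api.example.com), a data-collection step might look like this:

```python
# Minimal data-collection sketch from a web API. The endpoint and parameters are hypothetical.
import pandas as pd
import requests

response = requests.get(
    "https://api.example.com/v1/sales",                   # hypothetical endpoint
    params={"from": "2019-01-01", "to": "2019-12-31"},
    timeout=30,
)
response.raise_for_status()                               # fail early on HTTP errors

records = response.json()                                 # assumes a JSON list of records
df = pd.DataFrame(records)
print(df.head())
```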

3. “Polishing” the collected data

This is the next step, and the one that feels most natural, because after extracting the data from its original sources we need to filter it. This process is absolutely essential, as analysing raw, unfiltered data can lead to distorted results.

In some cases, the modification of data and columns will be necessary in order to confirm that no variables are missing. Therefore, one of the most important steps to consider is the combination of information originating from various sources, establishing an adequate foundation to work on, and creating an efficient workflow.

It is also extremely convenient for data scientists to possess experience and know-how in certain tools, such as Python or R, which allow them to “polish” data much more efficiently.
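A minimal sketch of this “polishing” step, assuming Python with pandas and invented file and column names, could look like this:

```python
# Minimal data-cleaning sketch: the file names and columns are invented for illustration.
import pandas as pd

sales = pd.read_csv("sales.csv")                     # hypothetical source 1
customers = pd.read_csv("customers.csv")             # hypothetical source 2

sales = sales.drop_duplicates()
sales["order_date"] = pd.to_datetime(sales["order_date"], errors="coerce")
sales["amount"] = sales["amount"].fillna(0)

# Combine information from the two sources into a single working dataset
dataset = sales.merge(customers, on="customer_id", how="left")
print(dataset.isna().sum())                          # check which variables are still missing
```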

4. Exploring the data

When the extracted data is ready and “polished”, we can proceed with its analysis. Each data source has different characteristics, which imply equally different treatments. At this point, it is crucial to produce descriptive statistics and test several hypotheses about significant variables.

After testing some variables, the next step will be to transfer the obtained data into data visualisation software, in order to unveil any pattern or tendency. It is at this stage that we can include the implementation of artificial intelligence and machine learning.
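As a small illustrative sketch, assuming Python with pandas and matplotlib and a hypothetical prepared file (clean_sales.csv), the exploration step could start like this:

```python
# Minimal exploration sketch: descriptive statistics, correlations and a first chart.
import matplotlib.pyplot as plt
import pandas as pd

dataset = pd.read_csv("clean_sales.csv")                    # hypothetical prepared dataset

print(dataset.describe())                                   # descriptive statistics
print(dataset[["amount", "discount", "quantity"]].corr())   # hypothetical numeric columns

dataset["amount"].hist(bins=30)                             # distribution of a key variable
plt.title("Distribution of order amount")
plt.show()
```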

5. Creating advanced analytical models

This is where the collected data is modelled, treated and analysed. It is the ideal moment to create models in order to, for example, predict future results. Basically, it is during this stage that data scientists use regression formulas and algorithms to generate predictive models and foresee values and future patterns, in order to generalise occurrences and improve the efficiency of decisions.
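A minimal predictive-modelling sketch, assuming Python with scikit-learn and the same hypothetical dataset and columns as above:

```python
# Minimal predictive-model sketch: train on part of the data, evaluate on the rest.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

dataset = pd.read_csv("clean_sales.csv")                    # hypothetical prepared dataset
X = dataset[["discount", "quantity", "customer_age"]]       # hypothetical predictor columns
y = dataset["amount"]                                       # value we want to predict

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

predictions = model.predict(X_test)
print("Mean absolute error:", mean_absolute_error(y_test, predictions))
```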

6. Interpreting data / gathering insights

We are now approaching the final stage of implementing a data science project. In this phase, it is necessary to interpret the defined models and uncover important business insights, finding generalisations that can be applied to future data and answering the questions asked at the beginning of the project.

Specifically, the purpose of a project like this is to find patterns that can help companies in their decision-making processes: whether to avoid a certain detrimental outcome or repeat actions that have reproduced manifestly positive results in the past.

7. Communicating the results

Presentation is also extremely important, as project results should be clearly outlined for the convenience of stakeholders (who, in the vast majority of instances, are without technical knowledge). The data scientist has to possess the “gift” of storytelling so that the entire process makes sense, meeting the necessary requirements to solve the company’s problem.

If you want to know more about data science projects or if you’d like a bit of advice, don’t hesitate to get in touch.


The impact of Big Data on Social Media Marketing

Social media was born with the intent of creating remote connections between colleagues, friends and others who wanted to share knowledge and information. Even though this purpose remains at its core, the truth is that social media has evolved exponentially over the years, becoming a powerful two-way communication tool between companies and clients.

Nowadays, social media allows companies to publicise their brand and products, facilitating the rapid growth of their client base while also allowing the ceaseless collection of inputs from their users, whether they are clients or not.

For that reason, each like, comment or share gives companies a better understanding of their clients and their respective behaviours, through the way in which they interact with specific types of content. This behavioural analysis and exchange of information generates a massive volume of data, which can only be stored and processed using “Big Data” technologies.

In reality, Big Data has impacted on almost every sector of our daily lives, shaping the way people communicate, work and even have fun.

In recent decades, the quantity of generated data has been growing exponentially, doubling its size every two years, potentially reaching 44 trillion gigabytes in the year 2020. The massification of the World Wide Web and the Internet of things abruptly increased the amount of generated data, equally intensifying the necessity to diminish the time it takes to transform and access that same data.

Big Data is the technological concept that encompasses a particular set of practices and tools, tackling this problem using 5 fundamental principles:

  • Volume (storing, processing and accessing vast amounts of data)
  • Variety (cross-referencing data from various sources)
  • Velocity (the speed of data access, treatment and processing)
  • Veracity (guaranteeing the veracity of the information)
  • Value (the usefulness of the information processed)

This “new” data access method and processing power has established a new paradigm within the marketing sector. Now it’s easier to analyse and identify trends, as well as possible cause and effect relationships to apply to marketing strategies. These types of analyses have become indispensable to companies for increasing the percentage of messages that actually reach the target, resulting in the growth of their ROI (return on investment).

How do we take advantage of Big Data in a marketing strategy?

The first step is to establish a relationship between the unstructured data provided by social media and the data already available, such as your clients’ details. After completing this step, it will be easier to observe and analyse your clients’ actions, collecting important insights that will form a solid base for your future campaigns.

Now you can outline marketing strategies focused on all the insights you’ve gathered. In other words, you are now able to design marketing campaigns anchored by content that fulfills the needs of your clients, or segmented groups of clients.

Now it is time to execute! With the most actionable content in hand, based on your analyses, you can discover just how effective your strategy really is.

You’ve almost certainly worked out that this is a fundamental formula for success, but reaching that sweet spot requires constant “fine-tuning”. In other words, from this point forward, your digital marketing strategy works as a cycle: more insights about your clients lead to better-targeted strategies and content with greater reach, which in turn generate even more insights.

Social media marketing is a tool that allows a company of any size, in any market, to better understand its clients and work out the most effective strategies to shape its offers to satisfy their needs.

The truth: without Big Data, none of this would have been possible!


Machine Learning: autonomous learning

Machine Learning has been developing further every day, thanks to the digital transformation movement. It was originally based on the theory that computers could learn to perform specific tasks and to recognise patterns. The challenge was simple: to check whether computers could learn from data.

Machine Learning provides systems with the possibility to learn and improve from experience, without needing specific programming for that effect. The focus is on developing programs that use available data and can learn on their own. The mathematical models are built and powered with – potentially – large amounts of data. The algorithms learn to identify patterns and to extract insights that are applied when new information is processed. This term dates back to 1959, when the pioneer Arthur Samuel defined Machine Learning as the ability of a computer to learn without being explicitly programmed to do so.

This learning process starts with processing data and trying to identify patterns. The main goal is to allow computers to learn autonomously, without human assistance, and to use that knowledge to make decisions according to what was “learnt”. Even though machine learning algorithms have been around for a long time, applying these mathematical calculations to Big Data with increasing frequency is a more recent development. However, according to industry reports, what is considered exponential growth in this area today will be seen as mere “baby steps” in 50 years. This field of AI is expected to grow extremely fast in the coming years.

Examples of Machine Learning

The continuing interest in this practice stems from a few key factors that have also made data mining and Bayesian analysis extremely popular: growth in the volume and variety of available data, cheaper and more powerful computational processing, and low-cost storage.

A few examples of machine learning applications include self-driving vehicles; recommendations from online platforms such as Amazon and Netflix based on users’ behaviour; voice recognition systems such as Siri and Cortana; PayPal’s platform, which relies on machine learning algorithms to fight fraud by analysing large quantities of customer data and assessing risk; Uber’s model, which uses algorithms to determine arrival times and pickup locations; spam detection mechanisms in email accounts; and facial recognition on platforms such as Facebook.
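Taking the spam example above, and purely as an illustrative sketch assuming Python with scikit-learn and a tiny invented training set, a text classifier can be trained like this:

```python
# Minimal spam-detection sketch; real systems learn from millions of labelled emails.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "Win a free prize now", "Cheap loans, act now", "Limited offer, click here",
    "Meeting agenda for Monday", "Project status report attached", "Lunch tomorrow?",
]
labels = [1, 1, 1, 0, 0, 0]                      # 1 = spam, 0 = legitimate

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)
print(model.predict(["Click here for a free offer", "Status report for the project"]))
```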

Industries that are choosing Machine Learning

Most industries with large amounts of data have already acknowledged the potential of this technology. The possibility to extract insights allows companies to obtain a competitive advantage and work more efficiently.

Financial Services

Banks and other financial entities are using machine learning with two goals: extracting valuable insights from customer data and preventing fraud. Insights help identify investment opportunities suited to customers’ profiles, while on the fraud side the identification of high-risk customers and suspicious transactions is improved.

Furthermore, this technology can also influence customer satisfaction. By analysing a user’s activity, smart machines can predict, for example, a possible account closure before it happens and prompt mitigating actions.
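As a hedged sketch of the fraud-prevention side, assuming Python with scikit-learn’s IsolationForest (an anomaly-detection technique) and invented transaction data, suspicious transactions can be flagged like this:

```python
# Minimal anomaly-detection sketch for suspicious transactions; data is invented.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Columns: transaction amount, number of transactions in the last hour
normal = rng.normal(loc=[50, 2], scale=[20, 1], size=(500, 2))
suspicious = np.array([[5000, 30], [3500, 25]])           # unusually large and frequent
transactions = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
flags = model.predict(transactions)                       # -1 marks potential anomalies
print("Flagged transactions:\n", transactions[flags == -1])
```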

Health

Health entities can capitalise on the integration between IoT and data analysis to develop better solutions for patients. The emergence of wearables makes it possible to collect data on patients’ health, which in turn allows health professionals to detect relevant patterns, including risk patterns. This technology therefore offers the potential for better diagnosis and treatment.

Retail

Nowadays, the impact of smart machines on users’ retail experience is quite obvious. The result is a highly personalised service that includes recommendations based on purchase history or online activity; improvements in customer service and delivery systems, where machines decipher the meaning of users’ emails and delivery notes in order to prioritise tasks and ensure customer satisfaction; and dynamic price management, identifying patterns in price fluctuations and allowing prices to be set according to demand. The ability to gather, analyse and use data to personalise, for example, a purchase experience (or implement a marketing campaign) is the future of retail.

Transportation

Analysing data to identify patterns and trends is key to the transportation industry, since profit growth depends on more efficient routes and on anticipating potential problems. Data analysis and the modelling aspects of machine learning are important tools for delivery companies and public transport operators, allowing them to improve their income.

Machine learning applications allow companies to automate the analysis and interpretation of business interactions, extracting valuable insights that make it possible to personalise products and services.

Xpand IT has a complete service portfolio in Machine Learning. If you want to know how to use Machine Learning in your business and obtain real added value, we can help. Do you want to know how we can help your business? Contact us here and get the best out of this technology!



DevOps is not Dev & Ops – What I didn’t know about DevOps

All these years I have heard about DevOps, but I was truly convinced it was too techy for me.

I thought it was about continuous integration, automation, and awesome DevOps guys, who knew not only how to develop software but also how to release and manage production environments…

Now, I realise that I was completely wrong… DevOps is not Dev & Ops teams together… but an entire organisation that collaborates – really collaborates.

Of course you need automation; of course you need continuous integration – but that’s not all.

In a DevOps culture you must follow these rules:

  • Know the flow = understand how work goes from “to do” to “done”
  • You don’t work in a silo = instead of working in an isolated team that is just worried about their “own” work, you work for a purpose/value
  • You are constantly learning & improving = Don’t waste time – if something needs to be changed, change it

But how can you transform a whole organisation? Below, you can see some practical tips:

  • Make your work visible to everyone; don’t worry what others may think about it.
  • Change your mindset. Let me tell you a story that someone once told me:

Once, when visiting NASA, JFK saw a janitor cleaning the floor and asked him: “What are you doing?” He expected an answer like “I am cleaning the floor”, but instead the man said: “I’m helping the men get to the Moon.”

  • Add value to your user stories; don’t create them just for someone to do something, but because you need to generate value, like improving customer satisfaction to 80%.
  • Collaborate, collaborate, collaborate even more… No man is an island, so don’t work like one.

Tools are not the most important element, but they can definitely help. Running shoes don’t make you a runner, but they will help you to run better.

If you are searching for tools that can help you understand the flow of work, make your work visible, and help you collaborate better with your team, just take a look at Jira, which allows teams to capture and organise work, assign it to the team and track team activity.

Sofia Neto

Collaboration & Development Solutions Manager, Xpand IT
