Feed aggregator

Video doorbell firm Ring says its devices slash crime—but the evidence looks flimsy

MIT Top Stories - Fri, 10/19/2018 - 11:30
The Amazon-owned security company says neighborhoods that use its products are safer, but the studies are unclear at best

AI fails: why AI still isn’t ready to take your job

IT Portal from UK - Fri, 10/19/2018 - 09:30

The fear of AI-fuelled job loss is spreading. No matter what industry you’re in, AI-powered bots and software are taking a crack at it. AI seems to be ringing the death knell for all manner of jobs, tasks, chores and activities. From doctors to customer service agents to household assistants, no job feels safe.

Naturally, this has people worried about the future. But is AI ready to take over our jobs, or even likely to do so ever?

Prevalent AI-fuelled failures suggest not. Howard Williams, marketing director at Parker Software, explores how recent artificial intelligence mishaps show that AI isn’t ready to take our jobs just yet.

From hospitals…

AI is going everywhere, and even doctors are feeling threatened by the new tech. It is now infiltrating hospitals, for example, to help with oncology, clinical trial matching and genomics. It all sounds rather impressive. Sadly, the AI hasn’t yet lived up to the claims.

It was reported earlier this year, following some leaked documents, that one such AI supercomputer had been poorly trained to assist with cancer diagnosis. The program failed to perform its basic function, instead making several incorrect and unsafe recommendations. In fact, it was suggested that the program was not usable in most cases.

In this example, then, even the best medical AI was still unable to perceive and understand things the same way a human can. Plus, being a good doctor isn’t just about diagnosis and treatment. There’s a distinct need for the more human traits that enable a good, reassuring bedside manner — something AI isn’t likely to achieve for a while yet.

In the future — if properly trained — AI may well find a permanent place in hospitals. However, it’s unlikely that AI will be capable of taking the entire role of a doctor. Rather, it will be a tool to assist them. It’d be more like a glorified computer than a human doctor — so, hospital computers might be out of a job. Doctors? Not so much.

When it comes to medical care, AI still needs to be propped up by human flexibility, empathy and understanding.

…To hospitality

Hospitality is another industry that artificial intelligence appears keen to get stuck into. But, once again, left alone, AI really isn’t very good at it.

Take bartending, for instance. A basic, integral requirement for a bartender is the ability to pick up a glass. Simple enough, unless you’re an AI robot. And bartending is about more than handling glasses: if AI can’t manage that, it can’t hope to recognise and manage drunken behaviour, chat to regulars, or adapt its service to suit a stag group or a corporate event.

Where AI might succeed in hospitality is in logical, fact-based tasks — such as streamlining the check-in process at a hotel. What it can’t do is offer human understanding and a welcoming smile.

In other words, large areas of hospitality need the human touch to make them, well, hospitable.

Meanwhile, in business…

Hospitable service isn’t just for hospitality, either. Customer service jobs are also proving tempting for artificial intelligence. In fact, customer service is one of the best examples of AI being unable to take jobs outright, yet proving proficient at supporting human team members.

There have been multiple AI fails when it comes to customer service. For example, Fabio the Pepper robot lasted only a week in his customer service role at a Scottish supermarket. Why? He confused and scared customers. Fabio’s failure highlights an important lesson for AI in customer service: you can’t bank on flashy tech alone to create a great customer experience.

This kind of AI customer service flop happens online and in apps too. Many chatbots have, at some point, failed to understand basic messages, or attempted to do too much at once. But chatbots have started to settle into their customer service role — alongside human employees, rather than instead of them. It’s here that both the failures and the successes of AI demonstrate that it can’t necessarily take over human jobs — but it can support them.

AI isn’t rising to supremacy in other areas of business, either. Take, for example, Amazon’s secret AI recruiter, which had to be scrapped after showing a distinct bias against women. The problem is, artificial intelligence doesn’t have a moral compass. It does what it is told to do but doesn’t comprehend whether it’s the right thing to do. Only the human touch provides that level of understanding, and it is humans that need to guide and support AI in business.

Putting your feet up: home assistant fails

Often, AI fails at home too. This might not threaten jobs, but it does give more insight into the abilities and shortfalls of current artificial intelligence. In particular, AI home assistant failures show the shortfall in voice recognition.

It turns out, even established AI assistants have bad days, and Amazon’s Alexa provides the perfect example. The AI assistant ordered an expensive doll’s house and cookies when asked by a six-year-old. When the incident was reported on the news, several other Alexa devices attempted the same order at the request of the TV. Alexa could not differentiate between child and adult voices, and it listened rather too closely to the television set.

But Alexa is still pretty good, and popular in a lot of homes. Plus, newer AI assistants will be able to do even more, right? Well, maybe — but if LG’s Cloi (pronounced kloh.ee) is anything to go by, we’re still a long way off. At the unveiling of the smart home assistant at CES 2018, Cloi repeatedly failed to respond to requests. This made for an awkward and embarrassing launch — one that demonstrates we still have a long way to go before AI voice recognition is reliable enough to enter the workplace.

Artificial intelligence is supposed to enrich our lives and make things easier. That’s the goal with our AI assistants at home, and it should be the goal with AI at work too. AI still has a long way to go. Even when it is doing well, it needs the human touch to augment its ability.

The hype, the headlines and the human

The rumours of the impending AI takeover have been greatly exaggerated. Headlines and hype have fuelled the fear of an AI-pocalypse.

The reality is far less imposing. AI will be able to handle some of our daily tasks. It might be able to handle a few of our jobs. But it can’t do anything without humans behind it, not just to maintain and monitor it, but to provide the human touch that AI can’t.

Artificial intelligence is becoming capable of an increasing number of tasks. But only humans have the empathy, understanding and flexibility that not only guide AI but make jobs worth doing.

Howard Williams, customer experience, Parker Software
Image Credit: PHOTOCREO Michal Bednarek / Shutterstock

A GDPR storm is coming – are you prepared?

IT Portal from UK - Fri, 10/19/2018 - 09:00

Cast your mind back to early 2018. The world was alive with the sound of GDPR commentary. In the run-up to the May compliance deadline, everything was up for debate. Would it spell the end of marketing as we know it? Was anyone actually compliant? Was it good news or bad news for businesses? And, getting the most air time – would GDPR be a damp squib like the Cookie Directive?

If you were of the opinion that GDPR was a lot of hot air, the intervening months may feel like vindication. GDPR has largely dropped off the agenda of most media publications and, with it, out of the minds of many business owners. However, we’re merely in the eye of the storm. In the last few weeks Facebook, and now Twitter, have been squarely in the crosshairs of regulators for allegedly failing to comply with GDPR. The EU has issued a stark warning that big fines will be handed down before the end of the year. Similarly, the ICO has ramped up its warnings that major action is likely. Added to this momentum is a seemingly endless series of high-profile data breaches, with Google+ the latest casualty.

For business owners who have put GDPR compliance on the backburner since May, the warning could not be clearer: if you aren’t GDPR compliant, you’re likely to be in serious trouble in the next few months.

Facebook has quickly become the poster boy for poor data governance procedures. Cambridge Analytica, data breaches, and GDPR failures have all come in quick succession and provide a case study for businesses on how not to collect and manage data. While it may be tempting to revel in some schadenfreude, a better approach is to see what every business can learn from Facebook and how they can protect themselves from the expected GDPR storm.

Kryptonite for data management

First, it should go without saying that financial organisations hold some of the most sensitive personal data. Thankfully, the most important data linked to account information has largely been well protected. However, high security standards around bank accounts can breed complacency, especially when you consider it’s not the only information the average financial company holds. The marketing, customer service and sales departments will usually each have their own customer databases, which may be subject to vastly different security and governance standards. A breach of any of this data could be fatal to a financial organisation and result in hefty GDPR fines.

General complacency is kryptonite for data management and protection. For Facebook, its complacency manifested itself in lax standards, questionable practices and a belief it would never be brought to account. For financial organisations, it can lead to blind spots related to data that is deemed less ‘sensitive’. Often, to enable smooth marketing, client management and sales operations, customer data is more readily accessible than financial information, shared with more parties, updated more frequently and inputted into more platforms. Each of these processes increases risk. Compounding this issue is a general lack of education related to the power of this data to do harm. Many would ask, what use is an email address to a hacker? The short answer is, a lot. This is why GDPR seeks to protect every piece of personal data.

If you’ve got to this point in this article and you’re beginning to feel some doubt surrounding your data practices – good. Now is the perfect time to audit and review all your data processes and security standards. The baseline should be – is everything GDPR compliant? If it was in May – is it still compliant? New technology, teams and initiatives can all impact your data processes and result in non-compliance.

A culture that personal data is a commodity

If you avoided all of this in the faint hope that GDPR wasn’t going to be an issue, you need to get on it immediately. In this instance, buying in technology and availing yourself of the services of specialist consultants will be the fastest (but not the cheapest) option.

Next, what is the general understanding among your staff? All the procedures and technological safeguards will mean nothing if your colleagues do not understand what GDPR is and the danger of data breaches. Undertaking regular company-wide training, and incorporating data management expertise and ethics into staff development and assessment, can be a powerful way to measure and improve education.

Finally, if the worst happens and there’s a breach – are you prepared? Time and again we see that a poorly handled response to a data breach generally does more damage than the breach itself. Again, I’ll point to Facebook and its slow, incomplete and unsatisfactory responses to each and every data issue it has encountered.

Slow responses are symptomatic of a failure to have the right procedures in place. This can be because there is no technology or expertise available to identify the breach in the first instance or the right people are not empowered to make quick decisions. You need to start from the position that any breach, no matter how minor it appears, is serious. It should be reported to a specialist team led by the CEO. Within that team should be the IT lead, marketing, customer service and legal. Consumers should be informed as quickly as possible, both to be GDPR compliant, and to reassure. The business needs to identify who is impacted, how, what went wrong, how it can be fixed and how consumers will be protected in the future. The faster these boxes are ticked and communicated the better the end result – especially if the ICO gets involved. As with anything, practice makes perfect. Conducting wargames and drawing up ideal responses and contingencies with this team could make all the difference.

We now live in a world where the reputation and future of a company can be destroyed by hacks and data breaches. Organisations are generally to blame for this environment. There has long been a culture that personal data is a commodity that businesses can deal with as they wish. Now the wheel has turned. If you’re one of the many business owners that still believes that data governance is just something for the IT department to worry about – you’re going to be in for a big surprise. By the end of the year, a number of large businesses will be hit with near-fatal fines as a warning to other companies. Acting now will ensure that your company is not one of these cautionary tales. 

Julian Saunders, PORT.im
Image source: Shutterstock/Wright Studio

Amazon to add 1,000 UK jobs

IT Portal from UK - Fri, 10/19/2018 - 08:30

Northern England, namely Manchester, is getting its first Amazon office. This is according to a new report by Reuters, which claims that the US retail giant will be looking to employ at least 1,000 people.

However, not all of them will be hired in the new Manchester office. The company is also looking to expand its two other development centres, with the roles mainly in research and development.

Roughly 600 of the positions, based in Manchester, will be corporate or development jobs, with 250 positions in Edinburgh’s development centre and 180 in Cambridge.

Amazon’s UK country manager Doug Gurr said Britain was taking a leading role in the company’s global innovation. “These are Silicon Valley jobs in Britain, and further cement our long-term commitment to the UK,” he said.

The new hires will work on Amazon tech for personalisation in shopping, machine learning, Alexa, Amazon Web Services (AWS) and Prime Air, the company’s drone delivery project.

Earlier this month, it was reported that Amazon was looking for retail space in London, to open its first Amazon Go store. Those are cashierless stores in which customers can simply walk in, grab what they need, and walk out. In-store sensors and a mobile app take care of the payment.

Image Credit: Ken Wolter / Shutterstock

Apple reportedly set to ditch Intel for Arm hardware

IT Portal from UK - Fri, 10/19/2018 - 08:00

The rumour that Apple is about to ditch Intel for Arm chips in its Mac line is back. This time around, it's Ming-Chi Kuo of KeyBanc Capital Markets who's making the prediction.

In a research note, Kuo says that the switch from Intel to Arm is quite likely to happen in either 2020 or 2021.

The media are all over the story, claiming there is some credibility to it. The Register says it would make perfect sense for Apple to make the jump: it would be cheaper, the company would be able to control and upgrade the hardware as it sees fit, and it would eliminate the potential supply woes that have, apparently, been plaguing Intel.

As an extra argument, it is also pointed out that Apple is no stranger to switching hardware suppliers for its devices. In the mid-2000s, the company moved from PowerPC to Intel.

ExtremeTech argues that another reason Apple might be considering the switch is the trouble Intel has had with its 14nm and 10nm roadmaps.

But perhaps the most important argument of all is that Microsoft's Windows 10 already runs on Arm. It obviously doesn't run as well as it does on Intel, but it's something. And as Arm chips become faster, more energy efficient and cheaper, such a move will become less and less surprising.

Image Credit: Pio3 / Shutterstock

Scrutiny on the data supply chain

IT Portal from UK - Fri, 10/19/2018 - 06:30

Although the term ‘supply chain’ is most commonly associated with manufacturing, it is now frequently being applied to the management of data within financial services firms. While these firms deal with growing volumes of raw data as opposed to raw materials, the principles of the supply chain remain the same.

As with any supply chain, being able to trace materials, or data, across the whole process is very important. In the data supply chain, financial services businesses need to understand and audit what happens to the data at each stage: who has looked at it, how it has been verified, and what decisions were made, keeping a full record throughout. Ultimately, they need to ensure traceability: the ability to track the journey of any piece of data across the supply chain and see both where it has been and where it ends up.
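
As a minimal illustration of what per-datum traceability could look like in code, here is a sketch; the class and field names are invented for the example, not drawn from any particular vendor's system:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class AuditEvent:
    actor: str          # who touched the data
    action: str         # e.g. "loaded", "verified", "approved"
    timestamp: datetime
    note: str = ""

@dataclass
class DataPoint:
    value: float
    source: str                                     # where the datum entered the chain
    trail: List[AuditEvent] = field(default_factory=list)

    def log(self, actor: str, action: str, note: str = "") -> None:
        """Record who did what to this datum, and when."""
        self.trail.append(AuditEvent(actor, action, datetime.utcnow(), note))

# Usage: every handler appends to the trail, so the datum's full journey is auditable.
price = DataPoint(101.25, source="vendor_feed_A")
price.log("etl_job_42", "loaded")
price.log("risk_analyst_1", "verified", "cross-checked against vendor_feed_B")
```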

The advantage for financial services firms who reach the end of this data supply chain is that the result supports informed opinion, which in turn drives risk, trading and business decisions. Bringing the data together in this way is important for many financial services firms. After all, the reality is that these businesses, today even more than pre-crisis, typically have many functional silos of data in place, a problem made worse still by the prevalence of mergers and acquisitions across the sector in recent times. Today it is commonplace for market risk to have its own database, and likewise credit risk, finance stress testing and product control. In fact, every business line may have its own data set. Moreover, each of these groups will also have its own take on data quality.

Many financial services firms increasingly appreciate that this situation is no longer sustainable. The end-to-end process outlined above should help to counteract it, but why is this happening right now?

There’s no doubt that regulation is a key driver. In recent years, we have seen the advent of the Targeted Review of Internal Models (TRIM) and the Fundamental Review of the Trading Book (FRTB) both of which demand that a consistent data set is in place. It seems likely that the costs and the regulatory repercussions of failing to comply will go up over time.

Additionally, it is becoming costly to keep all these different silos alive. The staff who originally developed them are often no longer with the business, or have a different set of priorities, making for a very costly infrastructure. Lastly, there is a growing consensus that if a standard data dictionary and vocabulary of terms and conditions are used within the business, and there is common access to the same data set, this will inevitably drive better and more informed decision-making across the business.

Developing a solution

In order to address these issues and overcome the data challenges outlined above, businesses should begin by ensuring that they have a 360˚ view of all the data that is coming into the organisation. They need to make sure they know exactly what data assets there are in the firm – what they already have on the shelf, what they are buying and what they are collecting or creating internally. In other words, they need to have a comprehensive view of exactly what data enters the organisation, how and when it does and in what shape and form.

This is far from trivial because, in large firms in particular, often due to organisational or budgetary fault lines, the same data feed may have been sourced multiple times, or the same data product, or slight variations of it, may be brought into the business on multiple occasions or via different channels.

Therefore, firms need to be clearer not only about what data they are collecting internally but also what they are buying. If they have a better understanding of this, they can make more conscious decisions about what they need and what is redundant and prevent a lot of ‘unnecessary noise’ when it comes to improving their data supply chain.

They also need to be able to verify the quality of the data, which effectively means putting in place a data quality framework that encompasses a range of dimensions: completeness, timeliness, accuracy, consistency and traceability.
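
A minimal sketch of what such checks could look like, assuming records arrive as Python dicts; the field names, tolerance and cut-off are illustrative assumptions (consistency, which requires comparing across data sets, is noted but not implemented):

```python
from datetime import datetime, timedelta

REQUIRED_FIELDS = {"isin", "price", "as_of", "source"}  # assumed schema

def check_record(record, reference_price=None, max_age=timedelta(days=1)):
    """Return pass/fail results for several data quality dimensions."""
    results = {}
    # Completeness: every required field is present and non-empty.
    results["completeness"] = all(
        record.get(f) not in (None, "") for f in REQUIRED_FIELDS
    )
    # Timeliness: the record is no older than the agreed cut-off.
    as_of = record.get("as_of")
    results["timeliness"] = bool(as_of) and datetime.utcnow() - as_of <= max_age
    # Accuracy (proxy): the value sits within 5% of a reference source, if given.
    if reference_price:
        results["accuracy"] = (
            abs(record["price"] - reference_price) / reference_price < 0.05
        )
    # Traceability: the originating source is recorded for audit.
    results["traceability"] = bool(record.get("source"))
    # Consistency would compare this record against other data sets (omitted here).
    return results
```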

Of course, to deal with all these data supply chain issues, businesses need the right governance structure and organisational model in place. Consultants can help here by advising on processes and procedures, and by ensuring, for example, that the number of individual departments independently sourcing data is reduced and that there is a clear view of what constitutes fit-for-purpose data.

If the right processes and procedures are in place, however, alongside a good governance structure, the organisation can start to think about a technological solution.

The use of technology

Technology can play a key role in helping organisations get a better handle on their data supply chains. For most businesses, a primary requirement is good data sourcing and integration capability: systems that understand financial data products, but also the different data models and schemas in place to identify instruments, issuers, taxonomies and financial product categorisations.

Organisations also need the ability to support workflow processes, as well as data reporting capabilities. Technology chosen to fulfil these roles must be capable of providing metrics on all the different data sources the organisation has bought: what benefits have been achieved from those sources, what their quality is, what gaps remain in the data, and how far along the organisation is in providing this data to business users for ad-hoc usage.

In addition to understanding and monitoring their supply chains and ensuring that an auditing and traceability element is in place, financial services firms must also guarantee that data governance and data quality checking is fully implemented. After all, to get the most from their data supply chains they must make the data itself readily available to users to browse, analyse and support decision-making processes that ultimately contribute to driving business advantage and competitive edge.

Martijn Groot, VP of Product Management, Asset Control
Image source: Shutterstock/alexskopje

BlackBerry vs Facebook – a David vs Goliath conundrum

IT Portal from UK - Fri, 10/19/2018 - 06:00

“When, as a culture, did we stop loving giant-killers? When did we stop appreciating the small, plucky fighter who could stand up to the behemoth? In school I heard about David and Goliath; at home, I now read Jack and the Beanstalk to my young child. It’s a pretty universal theme, and in my view a worthwhile one. So, what happened to our collective love-affair with the little guy? 

Earlier this year, BlackBerry sued Facebook in the US (United States District Court for the Central District of California) for infringement of multiple patents, relating to various aspects of Facebook Messenger, WhatsApp & Instagram. 

What surprised me most about this case at the time was the level of hostility that seemed to be brought to bear against BlackBerry in certain sections of the non-specialist press. Also, it was odd how charitable some journalists seemed to be to Facebook. Fundamentally, Facebook was being accused of infringing BlackBerry’s intellectual property rights. After all, would one look so charitably if a small, unheard-of company or individual were accused of counterfeiting Louis Vuitton bags, or of copying the Harry Potter books wholesale? I would guess not. 

Also, there seemed to be some (how can I put this) confusion in the lay press regarding a few fundamental principles relating to patents. I would like to try to clear that up here.    

For a start, some commentators seem to have complained that BlackBerry cannot possibly sue Facebook with respect to the features alleged to infringe (such as certain features related to instant messaging) because those features have become so very common in the modern world; ubiquitous, one might say. This misses the point entirely. It makes no difference at all how common the features have become in the present time, other than to the extent that BlackBerry might be able to obtain more money from Facebook in damages. What matters is whether the claimed inventions in the patents were new and inventive when the original patent applications were made.

No 'use it or lose it' principles

I have looked at the patents in question. They involve a variety of inventions, from generating cryptographic keys to linking a messaging service with a game. Some of the applications date back to 2010 or 2011; one of them even seems to date back as far as 2001. So the question is: were the claimed inventions new and inventive back then, when the patents were applied for? Answering this question typically involves asking a technical expert in the relevant field (computer science, telecommunications) to cast their mind back to the relevant date (which, if it’s a decade or more ago, is no simple task in such a fast-moving field) and ask whether someone in the field at that time would have come up with the patented invention just as an obvious, workshop development of what was around at the time. One must put knowledge of the patent itself, and everything else that post-dated it right up until the present day, out of one’s mind. Again, that’s not easy. But that’s the acid test. 

Also, some people were seemingly criticising BlackBerry because they no longer make telephones (talk about kicking someone when they’re down), and are trying to make money from their patents. Well, what’s wrong with that? The whole point of patents is that, in return for disclosing the details of the invention to the public, on a public register, the patentee gets a monopoly for a limited time (typically 20 years). That’s it. 

There’s nothing in patent law saying that you have to sell the thing you have invented. There’s no “use it or lose it” principle here. Also, there’s nothing saying that you have to have spent years and/or millions on coming up with your invention and getting your patent, even though in this case BlackBerry might well have done exactly that. You (and this means you too, dear reader, because anyone can be an inventor) come up with the idea for the invention, then you apply for the patent and, if it’s new and inventive, and described well enough to allow someone else to do what you have invented, you get your monopoly. That is what patents are. They are a reward for having made the details of your invention public. That still applies whether BlackBerry make telephones or not. 

Potential downsides

Facebook have of course defended themselves by saying that the patents in suit are invalid. All of that is fine. If the patents are not valid, and/or if they are not in fact infringed, then of course BlackBerry should not get a penny. But that might be a big “if”. 

Facebook have also retaliated by suing BlackBerry for infringement of several of Facebook’s own patents. This seems to be a change of tone from Facebook, certainly in light of earlier claims from Paul Grewal, Facebook’s deputy general counsel, to the effect that BlackBerry was seeking to tax other people’s innovation by way of their lawsuit. But in any event, Facebook of course have every right to take such action. 

To go back to the original lawsuit, though, the potential downsides of Facebook losing the case filed against them could be costly, to say the least. First, BlackBerry might obtain damages from Facebook for acts of infringement, which damages might be based upon how much money Facebook have made from the infringing features, or perhaps on the amount of money that Facebook would have paid BlackBerry had they taken a licence for the patents. 

Also, maybe even worse than that, Facebook could be slapped with an injunction preventing them from making available the relevant applications for the life of the relevant patent(s). The threat of an injunction is perhaps lowered in the US by the eBay decision, which established a principle that injunctions should generally only be granted where (amongst other things) the patentee suffers irreparable injury as a result of the infringement. (This is the one area where it might actually be relevant that BlackBerry are no longer selling telephones.) However, if an injunction were to be awarded, it could potentially cost Facebook even more than a damages award in terms of lost money, inconvenience and embarrassment. Also, in other jurisdictions (like the UK), where there is no eBay decision, it could be much simpler for a patentee to get a final injunction, if indeed they sue there as well.

Some people might argue that, far from large IT companies being oppressed by frivolous lawsuits (hand me my violin), the opposite is in fact the case; such companies can often arrogate to themselves other parties’ inventions without so much as a by-your-leave, and then effectively dare those other, smaller, parties to take action. Such actions are costly in both money and management time, and the fact that large IT companies tend to have such vast resources can lead to an imbalance of arms. Fighting giants, far from being a fairy-tale, can in practice be a horror story for the (relatively) little guy, spending time and precious money trying to enforce their legal rights against a foe with seemingly endless resources to throw at them. 

If BlackBerry are eventually successful against Facebook, they might even consider proceeding against other parties. Indeed, a success against Facebook might give encouragement to other patentees to take on Facebook themselves, or big companies like them. If the patentees’ rights are strong, and their complaints are well-founded, then so much the better as far as this commentator is concerned. Long live the giant-killers.”

Matt Jones, Partner, EIP
Image Credit: Blackberry

Know your audience – the people behind the data

IT Portal from UK - Fri, 10/19/2018 - 05:30

Requests for data and data sharing are now commonplace in consumer interactions with retailers and brands. Virtually every interaction consumers have with a brand involves some degree of data exchange, and businesses must embrace the insight consumer data provides and use it to enhance their overall customer experience (CX). It is important to note that today’s consumers are more empowered than ever to choose how, where and when they purchase, so the relationship they hold with a brand can be a key driver in their decision-making process.

There is no doubt that data collection and analysis provide a valuable opportunity for businesses to gain deep insight into customer preferences. This insight can be used to transform the experience a customer has with a brand by enabling a truly personalised service that places the customer at the centre of each interaction. It is now widely accepted that CX has the power to drive loyalty over and above other factors, including price. The 2017 Gartner Customer Experience in Marketing Survey revealed that 81 per cent of businesses surveyed expect to be competing mostly or completely on the basis of CX within two years. It is no surprise that businesses are scrambling to ensure their CX can compete.

In the battle for CX enhancement, the value of data should not be underestimated. Applied effectively, it can provide businesses with rich insight into the personal characteristics that make up the human behind a data point. Data is a precious asset: used correctly, it lets businesses develop detailed profiles of their audience and foster more personal (and profitable) connections.

Building personal connections through data

Customer segmentation is a popular and well-established method of ensuring relevance in interactions between consumer and brand. Building strong customer profiles enables businesses to identify who their customers are, what they are interested in and how best to interact with them, and to market to consumers in a way they find relevant. Carefully analysed data allows businesses not only to base customer communication on past interactions but also to build a predictive picture of likely future behaviours and buying preferences. One common technique is sketched below.
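
To make the idea concrete, here is a minimal sketch of one common profiling technique, RFM (recency, frequency, monetary) scoring; the column names, toy data and quartile scoring are illustrative assumptions only:

```python
import pandas as pd

# Assumed input: one row per transaction (customer_id, date, amount).
tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 4, 4, 4],
    "date": pd.to_datetime(["2018-01-05", "2018-09-20", "2018-03-11", "2018-08-01",
                            "2018-09-01", "2018-06-15", "2018-09-25", "2018-10-01"]),
    "amount": [20.0, 35.0, 120.0, 15.0, 18.0, 40.0, 55.0, 22.0],
})

snapshot = tx["date"].max() + pd.Timedelta(days=1)
rfm = tx.groupby("customer_id").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),  # days since last purchase
    frequency=("date", "count"),                            # number of purchases
    monetary=("amount", "sum"),                             # total spend
)

# Quartile scores, 4 = best; ranking first avoids duplicate bin edges on small data.
rfm["r"] = pd.qcut(rfm["recency"].rank(method="first"), 4, labels=[4, 3, 2, 1])
rfm["f"] = pd.qcut(rfm["frequency"].rank(method="first"), 4, labels=[1, 2, 3, 4])
rfm["m"] = pd.qcut(rfm["monetary"].rank(method="first"), 4, labels=[1, 2, 3, 4])
print(rfm)  # e.g. high r/f/m customers are candidates for a 'loyal' segment
```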

The reason this method works is that communicating with consumers in a meaningful way is only possible when a personal connection is made between the brand and the individual. Despite technological advances, emotional connections remain a powerful driver for consumers when choosing a brand - people like to interact with companies they feel understand and care about them. A Customer Thermometer survey of 1,000 people in the United States found that 65 per cent felt more connected to brands they believed cared about them.

For this reason, brands must look beyond transactional data when profiling customers and instead develop an understanding of the people behind the profile. The use of data to analyse customers should provide a three-dimensional view which identifies buyer preferences, motivations and needs while mapping how the brand can truly satisfy the customer with every interaction. Producing a clear representation of consumer activity provides businesses with valuable insight into individual behaviour and key drivers. By drilling down into customer behaviour in this way, brands can create more valuable interactions that drive trust and customer loyalty enabling businesses to reach out to customers on an emotional level.

Getting it right

The mistake many retailers make is deciding what type of customer they would like to attract and then trying to make their product or service suit the desired group. When profiling customers, businesses first need to ascertain who is actually buying the product, not who the brand would like to sell it to – this should clearly identify a number of profiles that can be targeted in different ways.

High-end supermarket Waitrose provides an example of a retailer that has successfully identified its customer profile and uses this data to interact with its known customer base in a considered way. Waitrose doesn’t try to compete with other supermarkets on price, focusing instead on the quality and shopping experience it knows its customers want. The MyWaitrose loyalty scheme offers members savings of up to 20 per cent on the items they buy most often, tailoring discounts to each customer’s specific preferences. The approach not only delivers a loyalty scheme that is different from the others but drives a very personalised service.

A matter of trust

The emotional connection an individual holds with a brand relies on consistency, service and trust. The introduction of the General Data Protection Regulation (GDPR) on 25th May this year, along with numerous news stories surrounding data sharing and data breaches at the NHS, Facebook and, most recently, British Airways, has highlighted the power of data, bringing it to the forefront of consumers’ minds. There is now a much greater degree of awareness among the general population about personal data use and misuse. According to a report released in February by the DMA, a clear majority of UK consumers (78 per cent) believe that businesses benefit disproportionately from data exchange. Businesses have a responsibility to treat consumer data with respect by protecting the rights of the individual. Using data ethically, and only in considered ways that enhance the individual’s experience, ensures trust is front and centre in the company-consumer relationship. Trust in an organisation remains the dominant prerequisite for engaging consumers within the data economy. In fact, 54 per cent of respondents to a recent DMA survey ranked it in their top three considerations for data exchange with a brand.

The GDPR applies much greater scrutiny to businesses that collect, hold and use personal information and consumers are much more likely to question motives for requesting their personal details.

Data unlimited

Data collection provides valuable insight for businesses and can greatly enhance the customer’s experience with a brand; the most important factor, however, is how that data is used. To fully unlock the potential of consumer data, businesses must use it to create a detailed, three-dimensional view of the customer.

By combining transactional data along with individual consumer behaviours, businesses build the ability to fully understand the people behind the data. This level of detail allows businesses to use the information to deliver a personalised service while cultivating more valuable relationships with customers, driving better service and generating great results for the business.

Stuart Robb, founder and CEO, Equiniti Data
Image source: Shutterstock/Jirsak

Cost comparison – cloud vs on-premise 2018/19

IT Portal from UK - Fri, 10/19/2018 - 05:00

As companies look ahead to 2019 and where they should be investing in IT, cloud is bound to be on the agenda. But what do the latest cloud cost comparisons look like and who should be investing in cloud? How does cloud stack up against on-premise?

Ultima has recently researched the cost of cloud vs on-premise computing and found some interesting comparisons. Importantly, because of advances in deployable software, the functionality of on-premise platforms has increased drastically, allowing organisations to consume on-premise IT in a similar manner to public cloud.

To allow us to compare the two offerings, we looked at a similar sizing for the infrastructure (for both on-premise and cloud), as follows, with the aggregate footprint totalled in the short sketch after the list:

  • 600 virtual machines
  • 4 vCPUs (virtual central processing units) per virtual machine
  • 16GB RAM per virtual machine
  • 256GB storage per virtual machine
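
For a sense of scale, the aggregate footprint implied by that sizing is easy to total up; this is a back-of-the-envelope calculation, not part of the original research:

```python
vms = 600
total_vcpus = vms * 4         # 2,400 vCPUs
total_ram_gb = vms * 16       # 9,600 GB (~9.6 TB) of RAM
total_storage_gb = vms * 256  # 153,600 GB (~153.6 TB) of storage

print(f"{total_vcpus} vCPUs, {total_ram_gb} GB RAM, {total_storage_gb} GB storage")
```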

There are five different platforms that have been considered for this, each with their own benefits and pitfalls:

  • Public cloud platform based within the UK
  • Public cloud platform based within the European region
  • A modern infrastructure based on software-driven server, storage and networking
  • A hyperconverged platform, where the compute and storage scale linearly together
  • A traditional IT infrastructure: a hardware-driven platform that lacks flexibility

Good news

We found that, even with the additional functionality, all of the on-premise costs have dropped. The same can be said for cloud costs: given how new the UK platform was last year, costs there have dropped drastically. Some costs have been changing at a far more rapid rate than others.

If you applied a different style of migration you would end up with a very different picture; the cloud becomes much more cost-efficient if you start leveraging additional services, such as auto-scaling instances and rapid scaling. However, these services require a much more complex, and therefore more costly, migration.

SME vs large business

For smaller businesses, despite this year’s reductions, the cost of running on-premise systems is still a prohibitive overhead, so SaaS (Software as a Service) and IaaS (Infrastructure as a Service) still work more effectively.

Businesses needing to scale quickly may need to provide bespoke applications to suit specific business use cases. For them, the ability to run these applications, and to build platforms that support a diverse range of applications, becomes paramount.

Another important caveat is that this analysis is based on a static and predictable workload; in the ‘real’ business world, not every application is of this nature, and businesses need to take that into account. Workloads that vary widely throughout their lifecycle may well end up as hybrid or fully public cloud workloads. As businesses evolve and grow over the next year, they may need applications that leverage capabilities the cloud can provide, such as machine learning, where these don’t exist on-premise.

Cloud vs on-premise

The research highlighted that the public cloud should be used where it allows a business to augment its on-premise capabilities. Every workload has its own characteristics and requirements, and it is on this mix that businesses should base their selection.

For example, if a business is using VDI (Virtual Desktop Infrastructure) only between 9am and 5pm in the UK, it could use the cloud and auto-scaling to cut costs overnight, whilst still giving the business the number of desktops it needs during the day. The other option is to re-architect your applications to take advantage of PaaS (Platform as a Service) and SaaS technologies; if this is carried out, the costings will improve too.
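
As a rough illustration of that scheduling effect, the figures below are invented for the example (the desktop count, hourly rate and 30-day month are all assumptions):

```python
desktops = 200  # hypothetical VDI estate
rate = 0.10     # hypothetical cost per desktop-hour, in GBP

always_on = desktops * rate * 24 * 30    # running 24/7 across a 30-day month
office_hours = desktops * rate * 8 * 22  # 9am-5pm, ~22 working days

print(f"24/7: £{always_on:,.0f}/month vs office hours: £{office_hours:,.0f}/month")
# Auto-scaling to office hours cuts the compute bill by roughly 75% in this scenario.
```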

Businesses using cloud services might not look to pocket these savings, but to reinvest them in order to gain a higher level of functionality within the platform. This could include software for automation and network virtualisation.

Looking ahead

Saving the best news for last, it is highly likely that these costs will continue to decrease in 2019. This is due to the increasing supply of flash drives from the new fabrication plants that have come online in 2018 and which will lessen hardware costs. This, coupled with the new competitive nature of AMD (Advanced Micro Devices) within the server market, means the costs for both on-premise and public cloud will decrease.

Increased competition in the network virtualisation world, with two goliaths fighting for control (Cisco with ACI and VMware with NSX), should also be good news for businesses looking at these types of technology.

Over the next 12 to 24 months we are also going to see a big change in how applications are built and designed. We will move away from monolithic applications into a world built on microservices which will allow businesses to take advantage of a hybrid workload which is being load-balanced across both public and private platforms.

So, what’s right for your business?

We recommend that any company considering a move to the cloud undertakes an assessment to help it plan the journey and gain senior management buy-in. A key activity within the engagement is the discovery and subsequent assessment of the existing environment and requirements. This information should be gathered through a discovery process, including workshop discussions with business stakeholders, to ensure the business requirements are identified and solutions can be found that fit them.

Once this has been done, the potential proposed services should be reviewed against the captured requirements to ensure compatibility and completeness; any incompatibilities with the proposed solution should be identified at this stage. A decision can then be made as to the best approach (on-premise, public cloud or a mixture of both) and which solutions provider best meets your needs.

After finding the right solution and the right solutions provider, the IT team and solutions provider should produce a jointly developed strategy document which can be presented to the business. The presentation should fully explain the various options proposed, the reasons behind them and next steps. Only with such a structured approach will the business be able to reap the benefits of using the public cloud services whilst still keeping costs under control.

Matthew Beale, Datacentre Solutions Specialist, Ultima
Image Credit: Everything Possible / Shutterstock

Want to know when you’re going to die?

MIT Top Stories - Fri, 10/19/2018 - 05:00
Your life span is written in your DNA, and we’re learning to read the code.

Top six CRM apps for startups

IT Portal from UK - Fri, 10/19/2018 - 04:30

Businesses today have centralised, more efficient tools to manage their activities, including finances, inventory, marketing initiatives and returns. These tools and applications can provide a clear picture of resources and investment, letting firms assess what they put in against the returns they get. For startups and SMEs in particular, they offer insight into day-to-day activities and help them stay aware of their current and future financial standing.

Salesforce:

Salesforce is an on-demand customer relationship management (CRM) suite offering applications for SMEs with a strong emphasis on sales and support. The Salesforce app can manage sales, automate marketing campaigns, manage customer accounts, track sales leads, and keep tabs on marketing campaigns and post-sale services. You can also use Salesforce for Outlook to sync contacts, calendars, tasks and emails between the two applications. Small businesses can manage their contacts, track sales, handle tasks and events, work leads, and monitor their own performance. Salesforce CRM solutions are delivered as SaaS.

Rating: 4.5

amoCRM:

amoCRM is a cloud-based CRM solution that lets users manage their sales pipeline. It provides feedback and reports on the performance of individual salespeople, and is very effective for lead scoring and sales analysis. amoCRM also provides full visibility of sales, which can be broken down by lead count, revenue, sales rep and so on, and it allows users to create custom fields to help organise deals and contacts. A mobile version of the app is available for iOS and Android. The application is priced monthly, per user, and customer support is offered via email and phone.

Rating: 4.5

Infusionsoft CRM Software:

Infusionsoft is one of the best cloud-based CRM solutions for startups and SMEs managing sales and marketing campaigns. It combines customer relationship management, marketing automation and e-commerce functions in a single suite. For small businesses across different industries, it helps deliver sales volume and customer service, and it doubles as a strong analytics tool: users can run analysis across various determinants, such as emails, campaign performance, sales and revenue, and so keep track of ROI on sales and marketing campaigns. The application is accessible from smartphones and tablets, with apps for Android and iOS, and integrations are available for Gmail, Salesforce, Outlook and QuickBooks. It is sold on a monthly subscription, with support via phone, FAQs and email.

Rating: 4.0

FreeAgent CRM:

FreeAgent CRM is a cloud-based sales, customer service and marketing platform, ideal for helping SMEs develop good, healthy relationships with their customers. The system automatically pairs with the user’s email address to create and qualify leads. Users can also get follow-up reminders, perform custom and automated operations, and track every interaction with customers. FreeAgent CRM displays the latest actions and the complete history of each customer, and real-time follow-ups can be assigned to improve response times; a good response time is favourable for your company’s or brand’s reputation. The tool is customisable: users can create their own fields, hints, statuses/stages and reports to get a good understanding of their marketing campaigns. FreeAgent runs on almost any device and comes with a dedicated mobile application.

Ratings: 4.5

Agile CRM:

Agile CRM is a cloud-based CRM solution designed to meet the needs of SMEs. It lets users manage their contacts and doubles as a telephony tool, with appointment scheduling, marketing automation, landing page builders and knowledge base functions. Users can manage customer records and access customer data such as contact information, interaction records, social media accounts and lead scores. Email integration allows users to sync their data with various email services. Users can also track website visits to analyse customer behaviour and other dynamics. The service additionally helps manage tasks: you can drag and drop tasks in a list, sort the tasks at hand, add notes, and update tasks for yourself and for other members of the team.

Rating: 4.5

Salesmate:

Salesmate is a cloud-based CRM solution that helps SMEs across industries. It handles contact management, sales pipeline management, email marketing, and chat and phone integration. The contact management module lets users import contacts from Google, their mobile phone and Excel sheets, stores them centrally, and assigns contacts and leads to sales reps, who can then view, organise and prioritise their sales. Follow-up tasks can also be created automatically. All this and more is included on a monthly subscription, with support via phone and email.

Rating: 5.0

Paired with a reliable connection such as Mediacom internet, these CRM solutions can be an ideal combination for emerging companies and startups, making life easier with some outstanding features.

The CRM solutions above are our picks from a large pool of applications. If you are a startup or an SME looking for a CRM solution, they are a good place to start.

Nathan John, content editor, Mediacom cable
Image Credit: NakoPhotography / Shutterstock

The rise of ATM Attacks: What are the risks to banks and businesses?

IT Portal from UK - Fri, 10/19/2018 - 04:00

When cyber-criminals are intent on exploiting vulnerabilities in the attack surfaces of financial institutions, ATM systems can serve as primary access points. While ‘smash and grab’ attacks on ATMs are nothing new, in the rapidly evolving world of cyber-crime cash machines are now a focus for operatives aiming to siphon everything from customer data to old-fashioned cash.

The past few years have seen a spate of attacks on ATMs in the UK using trucks and stolen farm machinery. The aim is to steal the ATM intact and transport it to a site where the cash can be extracted by force. The alternative is ‘smash and grab’: breaking into the ATM on site to extract funds. Since 2016, almost 100 attacks of this type using gas explosions have been recorded by police in England and Wales, including 23 attacks by a single gang over a three-month period, which saw more than £1.5 million stolen across the Midlands.

But there is a new threat to be mindful of – one that isn’t physical but belongs to the world of cybercrime. This summer, the FBI issued a warning about an imminent global cyber-attack on commercial bank ATMs. Known as an ATM ‘cashout’, the anticipated attack centred on hacking a bank or payment processor to enable the fraudulent withdrawal of funds using cloned cards. This is typical of the sophisticated hacks that can impact consumers directly while derailing the operations of banks and businesses.

Threat categories

Over the past decade, ATM malware has developed rapidly. The 2017 European ATM Crime Report by EAST showed a 287 per cent rise in ATM black box attacks compared with the previous year. And while cyber-security solutions can deal with an array of infrastructural vulnerabilities, ATM hardware and operating systems often remain a particular weakness.

ATM attacks fall into two categories: physical or logical. A physical attack sees the perpetrator present before, during or after the crime. It involves the use of physical force to compromise the machine and is still very prevalent in the UK. The FBI warning concerned a logical attack, which generally involves malware and specialist electronics to gain control of the ATM and access to customer data and funds.

Skimming the top

Theft at the ATM interface is becoming more sophisticated and profitable. According to ATM manufacturer Diebold Nixdorf, ATM ‘skimming’ now has a global cost exceeding $2 billion. Skimming is the act of siphoning customer data at the ATM using hardware that mimics the appearance of legitimate machine components. The technology needed is easy to purchase legally online.

While methods and components vary greatly, skimming hardware is now more discreet and effective, and is often virtually impossible to spot. Some equipment is as thin as a credit card and can be installed inside the ATM’s card slot. Once operational, the ‘skimmer’ can siphon the card details of unwitting consumers – sometimes directly to the perpetrator’s mobile via Bluetooth.

Hitting the jackpot

The most sophisticated form of logical ATM attack is referred to as ‘cashout’ or ‘jackpotting.’ This approach involves infecting an ATM with malicious software. For instance, an early form of this type of attack involved the transfer of malware to the ATM on a USB through an interface portal. Modes of infiltration have since become more effective and require even less involvement by the hacker.

As recent research by EAST shows, ‘black box’ ATM attacks have been on the rise in Europe. To perform this type of jackpotting attack, the perpetrator connects a device known as a ‘black box’ to the ATM’s ‘top box’, the interior of the machine. The device then reverts the machine to supervisor mode and dispenses cash. The good news for banks is that while the number of attempted black box attacks in Europe has been increasing, the rate of criminal success has been falling, thanks to the work of agencies such as EC3, Europol’s European Cybercrime Centre.

Smart precautions

Financial gain is the motive behind 90 per cent of all cyberattacks, and unsecure ATMs present a soft target for criminals. Hackers are constantly looking for vulnerabilities across the spectrum of bank IT infrastructures and endpoints. And while banks safeguard against sophisticated phishing attacks across other areas of the network, they cannot afford to ignore the dangers ATMs are prey to. Hackers often view ATMs as easy access to a bank’s infrastructure. And while unauthorised access might not always be preventable, restricting the extent of this infiltration is key.

For example, hacking using hijacked employee credentials has become prevalent in recent years. This can be mitigated by centrally securing privileged credentials with multi-factor authentication and controlling network access on a need-to-know basis. Hackers are thereby restricted in their mobility through the environment and in the extent to which they can compromise security controls and access capital.

Vigilance for prevention

Moreover, there is an onus on banks to constantly monitor for threats. This should involve a holistic approach to how vulnerabilities are identified, with ATMs treated as a first line of defence. By constantly monitoring events and patterns, it becomes easier to spot irregularities and unusual activity – for instance, activity originating from the unauthorised use of employee credentials. If vigilance is consistent, reaction times become quicker, preventing the siphoning of data or access to cash by hackers.
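
As a simplified sketch of that kind of event monitoring, the snippet below flags unusual dispense activity; the event format and threshold are assumptions for illustration, not a description of any vendor's product:

```python
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)
MAX_DISPENSES = 5  # more than this per ATM per window is flagged as unusual

recent_dispenses = defaultdict(list)  # atm_id -> timestamps within the window

def record_dispense(atm_id: str, ts: datetime) -> bool:
    """Log a cash-dispense event; return True if the ATM's activity looks anomalous."""
    window_events = [t for t in recent_dispenses[atm_id] if ts - t <= WINDOW]
    window_events.append(ts)
    recent_dispenses[atm_id] = window_events
    # A real deployment would raise an alert and correlate with other signals here.
    return len(window_events) > MAX_DISPENSES
```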

Today, more than ever, banks and businesses need to recognise that ATMs require the same level of rolling security provision and upgrading as every other aspect of their infrastructure. Like all other forms of cyber-crime, ATM attacks are changing and adapting all the time. It is therefore essential for banks to understand the threat and to keep their ATM security one step ahead.

David Higgins, Director of Customer Development EMEA, CyberArk
Image source: Shutterstock/GlebStock

Activity Recognition for a Smartphone and Web-Based Human Mobility Sensing System

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:06
Activity-based models in transport modeling and prediction are built from a large number of observed trips and their purposes. However, data acquired through traditional interview-based travel surveys is often inaccurate and insufficient. Recently, a human mobility sensing system, called Future Mobility Survey (FMS), was developed and used to collect travel data from more than 1,000 participants. FMS combines a smartphone and an interactive web interface to better infer users' activities and patterns. This paper presents a model that infers the activity performed at a given location. We propose generating a set of predictive features based on spatial, temporal, transitional, and environmental contexts, with an appropriate quantization. To improve the generalization performance of the proposed model, we employ a robust ensemble-learning approach. Empirical results using FMS data demonstrate that the proposed method contributes significantly to providing accurate activity estimates for the user in our travel-sensing application.
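
The abstract gives no code, but the core idea can be sketched: quantize each stop's context into discrete features and let an ensemble vote on the activity label. The feature layout, the toy data, and the choice of scikit-learn's random forest below are all illustrative assumptions, not the authors' implementation.

# Sketch: quantized contextual features -> ensemble activity classifier.
# Feature rows: [hour-of-day bucket, day-of-week, distance-from-home
# bucket, previous-activity id]; data and labels are invented.
from sklearn.ensemble import RandomForestClassifier

X = [[2, 0, 3, 0], [2, 1, 3, 0], [5, 5, 1, 1], [5, 6, 1, 1], [0, 2, 0, 2]]
y = ["work", "work", "shopping", "shopping", "home"]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([[2, 3, 3, 0]]))  # similar context -> likely 'work'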

Towards Musicologist-Driven Mining of Handwritten Scores

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:05
Historical musicologists have been seeking objective and powerful techniques to collect, analyze, and verify their findings for many decades. The aim of this study was to show the importance of such domain-specific problems for achieving actionable knowledge discovery in the real world. Our focus is on finding evidence for the chronological ordering of J.S. Bach's manuscripts, by proposing a musicologist-driven mining method for extracting quantitative information from early music manuscripts. Bach's C-clefs were extracted from a wide range of manuscripts under the direction of domain experts, and the classification of these C-clefs was conducted. The proposed methods were evaluated on a dataset containing over 1,000 clefs extracted from J.S. Bach's manuscripts. The results show more than 70% accuracy for dating J.S. Bach's manuscripts. Dating of Bach's lost manuscripts was quantitatively hypothesized, providing a rough barometer to be combined with other evidence when evaluating musicologists' hypotheses, and the usability of this domain-driven approach is demonstrated.

Revealing Psycholinguistic Dimensions of Communities in Social Networks

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:05
In this article, the authors seek to answer one fundamental question: what brings people together to form a community? They explore the personalities (psychological) and values (sociological) of individuals in social network communities in order to understand this natural selection.

Camera Placement Based on Vehicle Traffic for Better City Security Surveillance

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:05
Security surveillance is important in smart cities. Deploying numerous cameras is a common approach. Given the importance of vehicles in a metropolis, using vehicle traffic patterns to strategically place cameras could potentially facilitate security surveillance. This article constitutes the first effort toward building the link between vehicle traffic and camera placement for better security surveillance.

Investigation on Unconventional Synthesis of Astroinformatic Data Classifier Powered by Irregular Dynamics

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:05
This paper discusses the combination of unconventional algorithms (in this case, evolutionary algorithms), deterministic chaos, and modeling on real astrophysics data. Analytical programming with a selected evolutionary algorithm is used to synthesize suitable models. The main attention in this paper is focused on various chaotic generators, which are used in place of classical pseudo-random number generators. The chaotic generators are used in conjunction with a special case: a generator based on a strange non-chaotic attractor. The performance of all chaotic and non-chaotic generators is then compared.
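
To give a flavour of the technique (these are not the paper's own generators), the sketch below replaces a pseudo-random draw with values from the logistic map, a standard example of a chaotic system; the seed, parameter and mutation step are arbitrary.

# Sketch: a chaotic generator standing in for random.random().
def logistic_stream(x0=0.123, r=4.0):
    """Yield values in (0, 1) from the logistic map x -> r*x*(1-x),
    which behaves chaotically at r = 4."""
    x = x0
    while True:
        x = r * x * (1 - x)
        yield x

gen = logistic_stream()
# Use chaotic draws where an evolutionary algorithm would normally
# call a pseudo-random generator, e.g. to perturb candidate solutions.
mutated = [g + 0.1 * (next(gen) - 0.5) for g in [1.0, 2.0, 3.0]]
print(mutated)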

From Raw Data to Smart Manufacturing: AI and Semantic Web of Things for Industry 4.0

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:05
AI techniques combined with recent advancements in the Internet of Things, Web of Things, and Semantic Web (jointly referred to as the Semantic Web of Things) promise to play an important role in Industry 4.0. As part of this vision, the authors present a Semantic Web of Things for Industry 4.0 (SWeTI) platform. Through realistic use case scenarios, they showcase how SWeTI technologies can address Industry 4.0's challenges, facilitate cross-sector and cross-domain integration of systems, and support the development of intelligent services for smart manufacturing.

Detecting Personal Intake of Medicine from Twitter

IEEE Intelligent Systems - Thu, 10/18/2018 - 18:05
Mining social media messages such as tweets, blogs, and Facebook posts for health- and drug-related information has received significant interest in pharmacovigilance research. Social media sites (e.g., Twitter) have been used for monitoring drug abuse and adverse reactions to drug usage, and for analyzing expressions of sentiment related to drugs. Most of these studies are based on aggregated results from a large population rather than specific sets of individuals. To conduct studies at the level of individuals or specific groups of people, it is necessary to identify posts mentioning intake of medicine by the user. Toward this objective, we develop a classifier for identifying mentions of personal intake of medicine in tweets. We train a stacked ensemble of shallow convolutional neural network (CNN) models on an annotated dataset, use random search to tune the hyper-parameters of the CNN models, and present an ensemble of the best models for the prediction task. Our system produces state-of-the-art results, with a micro-averaged F-score of 0.693. We believe the classifier has direct uses in psychology, health informatics, pharmacovigilance, and affective computing for tracking the moods, emotions, and sentiments of patients who express intake of medicine on social media.
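
For readers unfamiliar with the architecture, here is a minimal shallow text CNN in Keras in the spirit of the paper's base models. The real system stacks an ensemble of such networks and tunes their hyper-parameters by random search; every size, label and data point below is illustrative only.

# Minimal shallow text CNN (one convolution, global max pooling).
import numpy as np
from tensorflow.keras import layers, models

VOCAB, MAXLEN = 5000, 40  # assumed vocabulary size and tweet length

model = models.Sequential([
    layers.Embedding(VOCAB, 64),               # token ids -> dense vectors
    layers.Conv1D(128, 3, activation="relu"),  # trigram-like filters
    layers.GlobalMaxPooling1D(),
    layers.Dense(3, activation="softmax"),     # e.g. intake / possible / none
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy batch of tokenised tweets, just to show the shapes involved.
X = np.random.randint(0, VOCAB, size=(8, MAXLEN))
y = np.random.randint(0, 3, size=(8,))
model.fit(X, y, epochs=1, verbose=0)

An ensemble would train several such models with different filter widths and depths, then combine their predictions, typically via a stacked meta-classifier.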

UK businesses struggling to juggle demands of new technology

IT Portal from UK - Thu, 10/18/2018 - 08:00

Businesses are caught between a rock and a hard place – with their backs against the wall for good measure – when it comes to balancing their priorities, according to a new report.

Research from Fujitsu found that the majority of businesses are struggling to balance what their employees want, what their customers want, and what wider society wants.

All three are essential to business success (more or less, depending on where you look), but the three groups don't always have their desires or expectations aligned.

Overall, Fujitsu's report says that UK businesses currently prioritise the needs of their customers over those of employees and society as a whole.

Two-thirds (66 per cent) of all respondents globally believe that making a positive impact on society is key to future success. At the same time, 77 per cent believe an organisation must put the customer first if it wants to be successful.

“The world has transformed dramatically in recent years with organisations across all sectors feeling the effects of this change," said Rupal Karia, head of public and private sector for Fujitsu UK & Ireland.

"More than ever, UK business leaders are recognising that true success will come from adopting a holistic approach: serving customers’ needs, unlocking the full potential of their workforce and ensuring a positive societal impact. It has become clear that the three groups are intrinsically connected with an organisation’s ability to influence the wider world through the employees and customers they engage with. Yet worryingly, throughout the study, there is uncertainty about the future and nervousness about whether their organisation is taking the right approach to each group. The pressure is on – as all three have huge influence on the future success of an organisation, it’s understandable that world business leaders find it challenging to manage this.”

For many, digital technology can provide the necessary solutions. Almost three-quarters (73 per cent) stated that “building and maintaining a creative and collaborative culture is key to having a positive influence on society.”

You can find the full report at this link.

Image Credit: Rawpixel.com / Pexels
