In the video below I outline five reasons why I think NFC chip implants are a bad idea.
As a quick background: I had an NFC chip implanted and used it as a boarding pass when traveling. I recorded a video showing what it looked like when I had the chip implanted and when I used it at the airport. I did this to find out whether it's a feasible use of NFC technology. My conclusion is that it is not. (Check out the original video here…). However, I did learn a lot from a general innovation and experimentation perspective. I'll come back to that in a later post.
In summary, the five reasons why NFC chip implants are a bad idea are:
- Solves no real problem
- Doesn’t work well
- Takes more time (compared with non-implanted tags, or in the case of boarding passes, the old kind: paper or a device)
- Can't be used for more than one thing at a time (in the majority of solutions)
- Serious health risks
Note: The list does not include irrational fears about integrity and privacy, since NFC chips cannot be remotely monitored or controlled; the "NF" stands for Near Field. Nor does the list include any irrational pseudo-Christian criticism. I have already commented on this in the comments section of the original video and won't engage any further on that topic here.
1998: “We need a homepage!”
2008: “We need an app!”
2018: “We need a chatbot!”
Most know it’s inevitable. Few really know why. Fewer know how.
Every ten years, curiosity about new technology seems to drive the same cycle: we iterate through discover, pilot, and learn, and eventually we arrive at sustainable business impact.
My advice is to not oversell the first few projects. Believe in them and enjoy the ride of piloting, but don't overspend. Don't kill the bot baby; evolve it instead. Try it with real users, real customers, real people.
As with the first homepages (1998) and apps (2008), some people won't see any use for them, and some will even mock them. Don't be the latter. Focus on usefulness and push on.
Some bots today are obviously just simple IF-statements, just as some of the first homepages were merely static brochures. Other bots are just simplified menu systems, exactly like some of the first apps. Eventually, if you sincerely focus on making it happen, your bot will deliver on the true cognitive promises of:
- really understanding both what you say and what you mean, in context
- really learning from its conversations
- really being able to connect to unstructured and unknown data sources on its own
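To make the contrast concrete, here is a minimal sketch (all function names and replies are my own hypothetical examples, not any platform's API) of the "simple IF-statements" bot mentioned above, next to a slightly more flexible keyword-overlap intent matcher. Real platforms such as Watson Conversation or Dialogflow train statistical models instead, but the difference in brittleness already shows at this toy scale.

```python
def rule_based_reply(message: str) -> str:
    """The 'IF-statement' bot: exact phrase matching, brittle by design."""
    text = message.lower()
    if "opening hours" in text:
        return "We are open 9-17 on weekdays."
    if "price" in text:
        return "See our price list at /prices."
    return "Sorry, I didn't understand that."

def intent_reply(message: str) -> str:
    """A step up: score each intent by keyword overlap, answer the best match."""
    intents = {
        "hours": ({"open", "opening", "hours", "when"},
                  "We are open 9-17 on weekdays."),
        "pricing": ({"price", "cost", "much"},
                    "See our price list at /prices."),
    }
    # Normalize: lowercase and strip trailing punctuation from each word.
    words = {w.strip("?!.,") for w in message.lower().split()}
    best_name, best_score = None, 0
    for name, (keywords, _answer) in intents.items():
        score = len(words & keywords)
        if score > best_score:
            best_name, best_score = name, score
    if best_name:
        return intents[best_name][1]
    return "Sorry, I didn't understand that."
```

A rephrased question like "When are you open?" defeats the IF-statement bot but still lands on the right intent in the second version; closing the remaining gap (context, learning from conversations) is what the cognitive platforms promise.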
My advice for the initial few phases:
- Start with a narrow but business relevant scope
- Identify what is new and what is not; for example, natural language processing is new (compared with apps), context relevance is not. Learn and/or recruit for the new.
- Involve real users / customers
- Understand the importance of architecture
- Master data management
- Build on an omnichannel-enabled digital platform with APIs
- Don’t be afraid to experiment with multiple AI/ML/cognitive platforms at the same time
- IBM Watson, Microsoft Cortana and Azure Bot Service, Google Cloud AI and Prediction, Dialogflow, Wit.ai, Amazon Lex, to name a few leading alternatives
… and most importantly, have fun. Life’s too short to not enjoy the ride.
I won't make this into a long post on the importance of addressing gender diversity, especially in our industry. I just want to state publicly that I, and many with me, notice how well IBM succeeds in putting women experts and leaders on stage and in the spotlight. This should be normal, and therefore this post should have been irrelevant. (Note: Yes, I see that the photos lack illustrations of racial/ethnic diversity, but I can assure you the presenters on stage were of many ethnic backgrounds. It just wasn't the point of this post.)
This is my last day at IBM InterConnect 2017. The past two days I've attended sessions on developer-driven innovation with Watson, using cognitive services to help fight cyberbullying, and Blockchain implementations.
For me, the most valuable learnings relate to the inner workings of the Watson services platform, for example Conversation Service, Visual Recognition, Virtual Agent, Discovery Service, Knowledge Studio, Natural Language, Retrieve and Rank, and Tone Analyzer. How it all fits together, the reference model, can be explained by organizing services into:
- Foundational cognitive skills
- Higher reasoning skills
- Knowledge organization skills
… and understanding how developer tooling and content tooling relate to each other. On this last day, I'm looking forward to attending the following sessions:
- Building an Integrated AI Service Desk Agent
- Roadmap for Next Generation Entrepreneurs
- Humanism of Technology
- Watson in Recruitment to Improve Candidate Quality
Below are some photos to give you an idea of what it looks like during and between sessions!