Generative Data Intelligence

How to Increase Your Esports Wellbeing using the PERMA Model

Introduction: Currently within esports there is a need to develop more holistic wellbeing practices and strategies to support gamers at all levels. With esports continuing to grow in popularity, and with more professional and semi-professional teams entering the arena, these teams have a growing responsibility to look after the wellbeing of their esports […]

Coronavirus SMS scam offers home PCR testing devices – don’t fall for it!

Free home PCR devices would be technological marvels, and really useful, too. But there aren't any...

Eisai: Update on the Phase 4 ENVISION Confirmatory Study of ADUHELM

Cambridge, MA, Jan 28, 2022 - (JCN Newswire) - Biogen Inc. and Eisai Co., Ltd. (Tokyo, Japan) today announced additional details about the Phase 4 post-marketing confirmatory study, ENVISION, of ADUHELM (aducanumab-avwa) 100 mg/mL injection for intravenous use in early Alzheimer's disease, including details of the study's goal for diverse enrollment and primary endpoint.

Biogen aims to enroll 18 percent of U.S. participants in ENVISION from Black/African American and Latinx populations. This goal is reflective of Biogen's ongoing commitment to increase diversity in clinical trials.

"Historically, patients from diverse backgrounds have been poorly represented in Alzheimer's disease clinical trials, and we are committed to changing this," said Priya Singhal, M.D., M.P.H., Head of Global Safety & Regulatory Sciences and interim Head of Research & Development at Biogen. "This goal matches the diversity among Americans diagnosed with early Alzheimer's disease, while at the same time, the trial will generate substantial data to verify the effectiveness of ADUHELM."

Biogen will implement multiple strategies to help overcome barriers to diverse patient enrollment in Alzheimer's disease trials, such as lack of access to medical centers, limited familiarity with the benefit/risk profile of treatment, and financial or logistical burdens.

"It's important to see this ambitious focus on diversity being prioritized in enrollment and integrated as a key part of the ENVISION clinical trial, so that we can have data from patients who more closely represent what we see in the clinic," said Dylan Wint, M.D., Cleveland Clinic Lou Ruvo Center for Brain Health, Nevada.

The companies also announced today that the primary endpoint for the global, placebo-controlled ENVISION trial will be measured by the Clinical Dementia Rating Sum of Boxes (CDR-SB) at 18 months after treatment initiation with ADUHELM. The CDR-SB endpoint is a validated measure of both cognition and function that is widely used in clinical trials of patients with early symptomatic Alzheimer's disease, is consistent with ADUHELM's Phase 3 EMERGE and ENGAGE studies, and is capable of generating robust outcomes. The update also includes an increase in the previously announced enrollment, from 1,300 to 1,500 people with early Alzheimer's disease (Mild Cognitive Impairment due to Alzheimer's disease and mild Alzheimer's disease) with confirmation of amyloid beta pathology, to further strengthen the data provided by the study.

Although ENVISION and other ADUHELM clinical trials are already planned or underway, the Centers for Medicare and Medicaid Services (CMS) recently released a draft National Coverage Determination (NCD), which would restrict Medicare coverage of ADUHELM and other amyloid-targeting therapies to patients enrolled in additional clinical trials. Biogen is committed to engaging with CMS to avoid unnecessary duplication of clinical trials and work towards finding a path to offer immediate access to patients to the first FDA approved treatment for Alzheimer's disease since 2003.

In addition to the primary endpoint, CDR-SB, secondary endpoints include Alzheimer's Disease Assessment Scale-Cognitive Subscale (ADAS-Cog 13), Alzheimer's Disease Cooperative Study - Activities of Daily Living Inventory - Mild Cognitive Impairment Version (ADCS-ADL-MCI), Integrated Alzheimer's Disease Rating Scale (iADRS), Mini-Mental State Examination (MMSE) and Neuropsychiatric Inventory (NPI-10).

The initiation of patient screening for ENVISION is planned for May 2022. Based on enrollment rates from the previous Phase 3 trials with ADUHELM, the primary completion date is expected to be approximately four years after the study begins. The companies are grateful to the healthcare professionals, medical centers, patients and families who will participate in this trial.

Previously, in July 2021, the companies set another substantial diversity goal in the observational Phase 4 ICARE AD trial, which aims to enroll a total of approximately 6,000 patients.

About ADUHELM (aducanumab-avwa) 100 mg/mL injection for intravenous use

ADUHELM is indicated for the treatment of Alzheimer's disease. Treatment with ADUHELM should be initiated in patients with mild cognitive impairment or mild dementia stage of disease, the population in which treatment was initiated in clinical trials. There are no safety or effectiveness data on initiating treatment at earlier or later stages of the disease than were studied. This indication is approved under accelerated approval based on reduction in amyloid beta plaques observed in patients treated with ADUHELM. Continued approval for this indication may be contingent upon verification of clinical benefit in confirmatory trial(s).

ADUHELM is a monoclonal antibody directed against amyloid beta. The accumulation of amyloid beta plaques in the brain is a defining pathophysiological feature of Alzheimer's disease. The accelerated approval of ADUHELM has been granted based on data from clinical trials showing the effect of ADUHELM on reducing amyloid beta plaques, a surrogate biomarker that is reasonably likely to predict clinical benefit, in this case a reduction in clinical decline.

ADUHELM can cause serious side effects including: Amyloid Related Imaging Abnormalities or "ARIA". ARIA is a common side effect that does not usually cause any symptoms but can be serious. Although most people do not have symptoms, some people may have symptoms such as: headache, confusion, dizziness, vision changes and nausea. The patient's healthcare provider will do magnetic resonance imaging (MRI) scans before and during treatment with ADUHELM to check for ARIA. ADUHELM can also cause serious allergic reactions. The most common side effects of ADUHELM include: swelling in areas of the brain, with or without small spots of bleeding in the brain or on the surface of the brain (ARIA); headache; and fall. Patients should call their healthcare provider for medical advice about side effects.

As of October 2017, Biogen and Eisai Co., Ltd. are collaborating on the global co-development and co-promotion of aducanumab.

About Biogen

As pioneers in neuroscience, Biogen discovers, develops, and delivers innovative therapies worldwide for people living with serious neurological diseases as well as related therapeutic adjacencies. One of the world's first global biotechnology companies, Biogen was founded in 1978 by Charles Weissmann, Heinz Schaller, Sir Kenneth Murray, and Nobel Prize winners Walter Gilbert and Phillip Sharp. Today, Biogen has a leading portfolio of medicines to treat multiple sclerosis, has introduced the first approved treatment for spinal muscular atrophy, and is providing the first and only approved treatment to address a defining pathology of Alzheimer's disease. Biogen is also commercializing biosimilars and focusing on advancing the industry's most diversified pipeline in neuroscience that will transform the standard of care for patients in several areas of high unmet need.

In 2020, Biogen launched a bold 20-year, $250 million initiative to address the deeply interrelated issues of climate, health, and equity. Healthy Climate, Healthy Lives aims to eliminate fossil fuels across the company's operations, build collaborations with renowned institutions to advance the science to improve human health outcomes, and support underserved communities.

About Eisai Co., Ltd.

Eisai Co., Ltd. is a leading global pharmaceutical company headquartered in Japan. Eisai's corporate philosophy is based on the human health care (hhc) concept, which is to give first thought to patients and their families, and to increase the benefits that health care provides to them. With a global network of R&D facilities, manufacturing sites and marketing subsidiaries, we strive to realize our hhc philosophy by delivering innovative products to target diseases with high unmet medical needs, with a particular focus in our strategic areas of Neurology and Oncology.

Leveraging the experience gained from the development and marketing of a treatment for Alzheimer's disease, Eisai aims to establish the "Eisai Dementia Platform." Through this platform, Eisai plans to deliver novel benefits to those living with dementia and their families through constructing a "Dementia Ecosystem," by collaborating with partners such as medical organizations, diagnostic development companies, research organizations, and bio-ventures in addition to private insurance agencies, finance industries, fitness clubs, automobile makers, retailers, and care facilities. For more information about Eisai Co., Ltd., please visit https://www.eisai.com.

MEDIA CONTACTS
Biogen Inc.
Ashleigh Koss
+ 1-908-205-2572
[email protected]

Eisai Inc. (U.S. Media)
Public Relations Department
+1-201-753-1945

Eisai Co., Ltd. (Media Outside the U.S.)
Public Relations Department
TEL: +81-(0)3-3817-5120


Biogen Safe Harbor
This news release contains forward-looking statements, including statements made pursuant to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995, about the potential clinical effects of ADUHELM; the potential benefits, safety and efficacy of ADUHELM; results from ENVISION; the treatment of Alzheimer's disease; the anticipated benefits and potential of Biogen's collaboration arrangements with Eisai; clinical development programs, clinical trials and data readouts and presentations; and risks and uncertainties associated with drug development and commercialization. These statements may be identified by words such as "aim," "anticipate," "believe," "could," "estimate," "expect," "forecast," "intend," "may," "plan," "possible," "potential," "will," "would" and other words and terms of similar meaning. Drug development and commercialization involve a high degree of risk, and only a small number of research and development programs result in commercialization of a product. Results in early stage clinical trials may not be indicative of full results or results from later stage or larger scale clinical trials and do not ensure regulatory approval. You should not place undue reliance on these statements or the scientific data presented.

These statements involve risks and uncertainties that could cause actual results to differ materially from those reflected in such statements, including without limitation unexpected concerns that may arise from additional data, analysis or results obtained during clinical trials; the occurrence of adverse safety events; risks of unexpected costs or delays; the risk of other unexpected hurdles; failure to protect and enforce Biogen's data, intellectual property and other proprietary rights and uncertainties relating to intellectual property claims and challenges; risks associated with current and potential future healthcare reforms; product liability claims; third party collaboration risks; and the direct and indirect impacts of the ongoing COVID-19 pandemic on Biogen's business, results of operations and financial condition. The foregoing sets forth many, but not all, of the factors that could cause actual results to differ from Biogen's expectations in any forward-looking statement. Investors should consider this cautionary statement as well as the risk factors identified in Biogen's most recent annual or quarterly report and in other reports Biogen has filed with the U.S. Securities and Exchange Commission. These statements are based on Biogen's current beliefs and expectations and speak only as of the date of this news release. Biogen does not undertake any obligation to publicly update any forward-looking statements, whether as a result of new information, future developments or otherwise.


Copyright 2022 JCN Newswire. All rights reserved. www.jcnnewswire.com

“We Are Ready to Go”: Entoprotech Knocks Out the Competition at Uniting Water Energy Food Innovation Finals

Black soldier fly circular economy start-up scores with judges at Expo Dubai DUBAI, United Arab Emirates–(BUSINESS WIRE)–#biotech–Circular economy start-up Entoprotech, which uses Black Soldier Fly Larvae (BSFL) to sustainably convert organic waste into high-value products, is a Global Winner of the Uniting Water Energy Food (UWEF) Innovation Finals. The competition was hosted at Expo Dubai […]

The Future of Healthcare in the Metaverse

How is the metaverse shaping the medical profession?

Tivic Health Collaborates With Leading Medical Research Hospital on First Controlled Trial to Extend Tivic’s Bioelectronic Platform

60-person clinical trial to investigate therapeutic device as an alternative to opioids following sinus surgeries SAN FRANCISCO–(BUSINESS WIRE)–Tivic Health® Systems, Inc., (Nasdaq: TIVC), a commercial-phase healthtech company focused on bioelectronic medicine, today announced initiation of its second clinical study investigating the extension of the company’s bioelectronic platform, offering a potentially new alternative to opioids for […]

Eisai’s Anti-amyloid Beta Protofibril Antibody Lecanemab Selected as the Background Therapy for the Tau Nexgen Study

TOKYO, Jan 19, 2022 - (JCN Newswire) - Eisai Co., Ltd. announced today that the Dominantly Inherited Alzheimer Network Trials Unit (DIAN-TU), led by Washington University School of Medicine in St. Louis, has enrolled the first subject in the phase II/III study (Tau NexGen study). The study will assess the effect of Eisai's investigational anti-microtubule binding region (MTBR) tau antibody E2814, in dominantly inherited Alzheimer's disease (DIAD).

People who have genetic mutations of DIAD are known to develop Alzheimer's disease (AD) and will likely develop symptoms at around the same age their affected parents did, often in their 50s, 40s or even 30s. The major AD pathologies are amyloid plaque that consists of amyloid beta (Abeta) aggregates; neurofibrillary tangles; and intraneuronal aggregates of tau, all of which are believed to spread throughout the brain.

The purpose of the Tau NexGen study is to assess the safety, tolerability, biomarker and cognitive efficacy of investigational therapies in pre-symptomatic or symptomatic participants who have an AD-causing gene mutation. In March 2021, the DIAN-TU selected E2814, which was created from a research collaboration between Eisai and University College London, as the first investigational medicine among anti-tau drugs for the Tau NexGen study. With increasing evidence from clinical studies showing that targeting amyloid can reduce biomarkers of AD, the Tau NexGen clinical trial leaders selected Eisai's investigational anti-Abeta protofibril antibody lecanemab (BAN2401) as the background anti-amyloid therapy, and the study design was amended in November 2021.

Eisai positions neurology as a key therapeutic area and will continue to create innovation in the development of novel medicines based on cutting-edge neurology research, seeking to contribute further to the benefit of affected individuals and their families in diseases with high unmet needs, such as dementia, including AD.

About Dominantly Inherited Alzheimer Network (DIAN)

The DIAN is an international research effort focused on dominantly inherited Alzheimer's disease. Dominantly Inherited Alzheimer's disease (DIAD) is a rare form of Alzheimer's disease (AD) that causes memory loss and dementia in individuals -- typically while they are in their 30s to 50s. The disease affects less than 1% of the total population of people with AD. The aim of the Dominantly Inherited Alzheimer Network Trials Unit (DIAN-TU) is to find solutions to treat or prevent this disease and, potentially, all forms of Alzheimer's. The DIAN-TU is an international public-private partnership dedicated to designing and managing interventional therapeutic trials for individuals with and at risk of DIAD.

About Tau NexGen study

The purpose of the Tau NexGen study is to assess the safety, tolerability, biomarker and cognitive efficacy of investigational therapies in people who have an AD-causing gene mutation. In the Tau NexGen study, symptomatic participants will be administered anti-amyloid beta (Abeta) protofibril antibody lecanemab for six months before being randomly assigned to also receive the anti-tau drug or a placebo. Since amyloid plaques accumulate before tau tangles in AD, this study design allows the researchers to assess whether amyloid removal clears the way for the anti-tau drug to function most effectively. Pre-symptomatic participants will be randomly assigned to receive the anti-tau drug or a placebo for a year before beginning lecanemab administration. By staggering the drugs in this way, the researchers will be able to evaluate the effects of the anti-tau drug alone before assessing the effects of the two drugs together. If the primary and secondary endpoints are positive in the analysis two years after the start of study, the study will be extended for another two years to assess whether the drug slows cognitive decline and has further effects on tau pathology.

About E2814

An investigational anti-microtubule binding region (MTBR) tau antibody, E2814 is being developed as a disease modifying agent for tauopathies including sporadic AD. Phase I clinical studies are underway. E2814 was discovered as part of the research collaboration between Eisai and University College London. E2814 is designed to prevent the spreading of tau seeds within the brains of affected individuals.

About Lecanemab (BAN2401)

Lecanemab is an investigational humanized monoclonal antibody for AD that is the result of a strategic research alliance between Eisai and BioArctic. Lecanemab selectively binds to neutralize and eliminate soluble, toxic Abeta aggregates (protofibrils) that are thought to contribute to the neurodegenerative process in AD. As such, lecanemab may have the potential to have an effect on disease pathology and to slow down the progression of the disease. With regard to the results from a pre-specified analysis at 18 months of treatment, Study 201 demonstrated a reduction of brain Abeta accumulation (P<0.0001) and slowing of disease progression measured by ADCOMS* (P<0.05) in early AD subjects. The study did not achieve its primary outcome measure** at 12 months of treatment. The Study 201 open-label extension was initiated after completion of the Core period and a Gap period off treatment (average of 24 months) to evaluate safety and efficacy, and is underway.

Eisai obtained the global rights to study, develop, manufacture and market lecanemab for the treatment of AD pursuant to an agreement concluded with BioArctic in December 2007. In March 2014, Eisai and Biogen entered into a joint development and commercialization agreement for lecanemab, and the parties amended that agreement in October 2017. Currently, lecanemab is being studied in a pivotal Phase III clinical study in symptomatic early AD (Clarity AD), following the outcome of the Phase II clinical study (Study 201). In July 2020, the Phase III clinical study (AHEAD 3-45) for individuals with preclinical AD, meaning they are clinically normal and have intermediate or elevated levels of amyloid in their brains, was initiated. AHEAD 3-45 is conducted as a public-private partnership between Eisai and the Alzheimer's Clinical Trial Consortium, which provides the infrastructure for academic clinical trials in AD and related dementias in the U.S. and is funded by the National Institute on Aging, part of the National Institutes of Health.

In September 2021, a rolling submission to the FDA of a Biologics License Application (BLA) for the treatment of early AD under the accelerated approval pathway was initiated. Lecanemab was granted Breakthrough Therapy designation in June 2021, a U.S. Food and Drug Administration (FDA) program intended to expedite the development and review of medicines for serious or life-threatening conditions.

* Developed by Eisai, ADCOMS (AD Composite Score) combines items from the ADAS-Cog (Alzheimer's Disease Assessment Scale-cognitive subscale), CDR (Clinical Dementia Rating) and the MMSE (Mini- Mental State Examination) scales to enable a sensitive detection of changes in clinical functions of early AD symptoms and changes in memory.
** An 80% or higher estimated probability of demonstrating 25% or greater slowing in clinical decline at 12 months of treatment, measured by ADCOMS, from baseline compared to placebo

Media Inquiries:
Public Relations Department, Eisai Co., Ltd.
+81-(0)3-3817-5120

Eisai Inc. (U.S.)
Libby Holman
+1-201-753-1945
[email protected]

Investor Contact:
Eisai Co., Ltd.
Investor Relations Department
TEL: +81-(0)70-8688-9685


Copyright 2022 JCN Newswire. All rights reserved. www.jcnnewswire.com

TechCrunch – Startup and Technology News

Attention deficit hyperactivity disorder (ADHD) symptoms can include anxiety, chronic boredom, impulsiveness, trouble concentrating or controlling anger, and even depression. But ADHD sufferers can face l

nCore Games has raised $10 million in a new financing round as the top Indian gaming firm gears up to launch web3 offerings, its top executives said Monday. Animoca Brands and Galaxy Interactive, two

Cash flow is a major pain point for small businesses in Africa. Long payment cycles, which can take 30-90 days after services or products have been rendered, and little or no capital, of which researc

Customer service tickets tend to fall into the same few buckets — returns, refunds or quality control questions. Not only are these repetitive, but it leaves little time for customer engagement on m

Barcelona-based Payflow, a YC-backed salary-advance fintech with ambitions to evolve into a neobank, has banked a $9.1 million Series A funding round — bringing its total raised since January 20

London-based legaltech startup Juro has grabbed $23 million in Series B funding for its browser-based contract automation platform. In total the startup has raised $31.5M since being founded back in

Asaak, a Ugandan asset financing startup, has secured $30 million in pre-Series A equity and debt funding. The round saw the participation of new and existing investors including Resolute Ventures, So

Tado, the German smart home startup that specializes in thermostats and more recently moved into flexible “time of use” energy tariffs based on loadshifting technology, is today announcing

Misty has had a hell of a journey. After raising $11.5 million from Foundry and Venrock, then crowdfunding its personal robot, the company hit a series of challenges. Today, Swedish social robot

Supply chain logistics — getting components and eventually finished products from A to B to C — is one of the most critical parts of running a business, not least because it is one of the

French startup Exotec has raised a $335 million Series D round in a new round of funding led by Goldman Sachs’ Growth Equity business. Following today’s investment, the company has reached a valua

Welcome to Startups Weekly, a fresh human-first take on this week’s startup news and trends. To get this in your inbox, subscribe here. Among many of the entrepreneur catchphrases out there, the on

Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest in mobile OS news, mobile applications and the overall app economy. The app industry continues to grow, with a re

Welcome back to The TechCrunch Exchange, a weekly startups-and-markets newsletter for your weekend enjoyment.

I worry we’re at risk of conflating technological growth for the social progress we really need. Women are still behind. Why?

Hello friends and welcome to Daily Crunch, bringing you the most important startup, tech and venture capital news in a single package.

Singapore-based industrial robotics firm Sesto this week announced a $5.7 million raise, featuring TRIVE, WTI GmbH and SEEDS Capital (Enterprise Singapore’s VC wing). The round follows a similarly s

Following news that the FTC’s antitrust suit against Meta cleared a critical hurdle earlier this week, the agency is apparently also taking a sharp interest in the company’s VR business. B

Whatever you do, don't refer to LG's incubator program LG Nova as a corporate venture capital (CVC) outfit.

Korean technology giant LG makes everything from televisions (they announced some new ones at CES), washing machines and fridges, to, well, it’ll probably take less time to list the things they

News

1/16/2022

The co-founder of Google Brain, Andrew Ng, commented that “massive data sets aren’t essential for AI innovation.” Some years ago, I spoke with a person from a tech giant that also wanted to get into the data business. I asked him what data they wanted to collect and how it would be used. His answer was to get all possible data and then find a way to utilize it. That response says a lot about the data business.

Many companies want to start their data journey with a massive IT project to collect and store lots of data. The discussion then easily turns to IT architecture, tool selection and how to build all the integrations. These projects take up large amounts of time and resources.

What is rarely considered is the real value we want from the data. And even if we have a plan for that, it can be forgotten during months or years of IT architecture, integration and piping projects. These projects are not run by people who want to utilize data; they are often run by IT bureaucrats.

Mr Ng also commented that people often believe you need massive data to develop machine learning or AI. There seems to be a belief that quantity can compensate for quality in data analytics and AI. I remember a discussion with a wearable device company whose spokesperson claimed you needed data from millions of people to find anything useful for building models.

There are use cases where big data is valuable. Still, in many use cases you can extract considerable value from small data sets, especially if the data is relevant. We can also distinguish horizontal and vertical data sets: do we want to analyze one data point from millions of people, or numerous data points from a small number of people? By horizontal and vertical I don’t mean how the data is organized in a table, but the horizontal approach of collecting something from many objects (e.g. heart rate from millions of people) versus the vertical approach of having more data from fewer objects (e.g. a lot of wellness data from a smaller sample group).

Looking at wellness data as an example illustrates the question well (no pun intended). A wearable device collects steps, heart rate and sleeping time from millions of people. We can then analyze this data to determine whether more steps and a higher heart rate during the day predict that a person sleeps longer that night, and then find a model that predicts similar outcomes for other people. But does it help us understand an individual’s wellness, sleep and health any better?
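The horizontal approach described above can be sketched in a few lines: fit one population-level trend from a single pair of variables collected from many people. The numbers below are hypothetical, purely to illustrate the shape of the analysis.

```python
# Illustrative sketch (hypothetical numbers): the "horizontal" approach --
# one pair of data points (steps, sleep hours) from many people.

def fit_slope(xs, ys):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return slope, my - slope * mx

# Toy population data: daily steps (thousands) vs. hours slept that night.
steps = [2, 4, 6, 8, 10, 12]
sleep = [6.0, 6.4, 6.8, 7.2, 7.6, 8.0]

slope, intercept = fit_slope(steps, sleep)
print(round(slope, 2), round(intercept, 2))  # one population-level trend
```

The output is a single slope for everyone; it says nothing about whether the trend actually holds for any particular individual.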

We can take another approach to building analytical models. One individual uses more wearable devices, collecting the usual exercise, heart and sleep data, but also blood pressure, blood glucose, body temperature, weight and some disease data. Now we might get different results about the relationship between heart rate, steps and sleep. We might see that the relationship depends on other variables, e.g. high blood glucose or blood pressure changes the pattern that holds for healthy people. These findings can be made from a small number of people.
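The vertical approach can be sketched the same way, this time with several variables from one person. Splitting the same individual's days by a third variable (here a hypothetical blood glucose flag, with made-up numbers) can reverse the trend that a population-level model would report.

```python
# Illustrative sketch (hypothetical numbers): the "vertical" approach --
# many variables from one person. The step/sleep relationship is fitted
# separately for days with normal vs. elevated blood glucose.

def fit_slope(xs, ys):
    """Ordinary least-squares slope for one predictor."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# (steps_k, sleep_h, glucose_flag) for 8 days of one person; 1 = elevated.
days = [
    (3, 6.2, 0), (6, 6.8, 0), (9, 7.4, 0), (12, 8.0, 0),  # normal days
    (3, 7.0, 1), (6, 6.6, 1), (9, 6.2, 1), (12, 5.8, 1),  # elevated days
]

normal = [(s, h) for s, h, g in days if g == 0]
elevated = [(s, h) for s, h, g in days if g == 1]

slope_normal = fit_slope(*zip(*normal))      # positive: more steps, more sleep
slope_elevated = fit_slope(*zip(*elevated))  # negative: the pattern reverses
print(round(slope_normal, 3), round(slope_elevated, 3))
```

Pooling all eight days would blur the two regimes together; conditioning on the extra variable is exactly the kind of finding the vertical approach makes possible from a small sample.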

The examples above are not intended to draw any conclusions about what is relevant for analyzing health; those conclusions must come from the data itself. But they illustrate how different approaches can yield quite different results. Wearable data at the moment is a good example of big-data thinking: the target has been to collect a few data points from millions and millions of people and then simply train models to conclude something from them, although we don’t know how relevant those data points are. It is also possible to build models from rich data about a few individuals, and that can actually be an exciting and valuable AI modelling task.

Of course, there are also cases where data models are built from a massive amount of data even though we don’t know what is relevant. For example, this podcast talks about hedge funds that try to collect all kinds of data and then build models to see if they can predict stock market movements. This goes far beyond traditional financial data: how people buy different kinds of food, watch streaming content and spend their free time, looking for ‘weak signals’ that predict trends and their impact on the investment market. Compared to many other data analytics cases it is different, because it doesn’t focus on analyzing a particular detail but collects all kinds of data more or less at random to see if anything relevant emerges, hoping to find new variables that could give a competitive advantage.

In most use cases, the important starting point for utilizing data and building AI is to understand the need and the target. Relevant data can then be chosen based on actual needs and on testing which data matters. Small but relevant data can produce a useful AI model. This typically requires taking the context into account, not just collecting a lot of random data points and building a model on top of them. Whatever data you have, you can always build a model, but that doesn’t guarantee the model makes any sense. Companies and developers should focus more on relevant data than on big data.

12/19/2021

Enterprises have been moving their services to the cloud for several years, and peer-to-peer (P2P) services have become well known, especially with blockchain and crypto. Yet individual users haven’t really adopted personal clouds, and the number of real P2P services is still quite limited. This could soon change.

I wrote earlier about decentralized solutions. Personal clouds and P2P apps are examples of the distributed applications mentioned in that article. But let’s take a more concrete and pragmatic approach: what could these applications be, and how would they work?

I was recently shown demos of some services that are basically apps users can run locally in their own browser, with data stored either locally or in the user’s own cloud:

  1. Two people can play a battleship game that runs only in their own browsers. There is no central server for the game. Matchmaking can happen in a purely serverless fashion when a user broadcasts a message that she or he is willing to play.
  2. A messaging service between two users that relies on no central server, just messages passed directly between the two applications.
  3. Two users can identify each other with identities held only locally, and then start encrypted communications without any third parties.
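As a rough sketch of the matchmaking idea in the first example, the toy Python script below lets one "player" announce availability and another reply directly over UDP on localhost. Real browser apps would use something like WebRTC data channels; the port number and message format here are invented:

```python
import json
import socket

# A would-be player announces availability; any peer listening on the port can reply.
PORT = 50555  # hypothetical matchmaking port

listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", PORT))
listener.settimeout(2)

announcer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
announcer.settimeout(2)
announce = json.dumps({"type": "looking_for_game", "player": "alice"}).encode()
announcer.sendto(announce, ("127.0.0.1", PORT))

# The listening peer sees the announcement and answers directly: no server involved.
data, addr = listener.recvfrom(1024)
msg = json.loads(data.decode())
reply = json.dumps({"type": "accept", "player": "bob", "to": msg["player"]}).encode()
listener.sendto(reply, addr)

response, _ = announcer.recvfrom(1024)
accepted = json.loads(response.decode())
print(accepted["to"])  # the two peers have now found each other
```

In a real serverless setup the announcement would go out as a broadcast or via a decentralized discovery mechanism rather than to a fixed localhost address, but the shape of the exchange is the same.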

These examples might sound simple, but they could be the start of a big revolution in applications and even how the internet is used. Of course, the very fundamental protocol of the Internet, TCP/IP, is based on packets routed from A to B. But in practice, most services during the last three decades have been based on client-server configurations, not local services and/or direct connections between users.

These services raise several technical questions, especially whether usability would be good enough for mainstream users. For example, users can already set up a connection by sending an invitation through traditional email and then establishing the P2P connection with local credentials, after which messages travel directly or through centralized services like email or messaging apps.

An interesting combination occurs when services use the user’s local applications and the user’s own cloud or similar storage services. It is hard to store and organize all of a user’s data locally when using several devices. However, the scenario changes if users have their own storage services and can get the needed data and apps from there for local use when needed. This storage is not a third-party central service but the user’s own service in a broader global infrastructure.

It sounds complicated, but does this really matter? With blockchain and crypto, we have seen how users can make transactions directly without third parties. It has enabled reliable payments anonymously without an authority or central service to track all transactions. It can offer a more reliable system, better privacy and no single point of failure.

But with these user services and P2P connections, we can do much more than simple crypto payments. Let’s take some examples:

  1. Users can keep all their data in their own services, refine, enrich and utilize that data with their local applications and then share some data, case by case, with other users or service providers.
  2. High security and privacy user-to-user communications without any third parties.
  3. Personal identities that users manage themselves that are not based on third-party authentication services, where two users can identify each other directly and start secure communications.
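The third example above, two users identifying each other and agreeing on an encryption key without any third party, can be sketched with classic Diffie-Hellman key agreement. This is a toy illustration only (the modulus is just a well-known Mersenne prime, not a vetted DH group), not production cryptography:

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement between two users, with no third party involved.
# Illustrative parameters only; real systems use standardized, vetted groups.
P = 2**127 - 1  # a well-known Mersenne prime, used here just to keep the math honest
G = 3

alice_secret = secrets.randbelow(P - 2) + 1
bob_secret = secrets.randbelow(P - 2) + 1

# Each side publishes only its public value; secrets never leave the device.
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Both sides derive the same shared key without any server in between.
alice_key = hashlib.sha256(str(pow(bob_public, alice_secret, P)).encode()).hexdigest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_secret, P)).encode()).hexdigest()

print(alice_key == bob_key)  # True: both ends now share an encryption key
```

Once both ends hold the same key, they can encrypt their messages directly, which is the essence of the "no third parties" claim in the example.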

Blockchain and tokens have received a lot of attention, but the examples above better demonstrate distributed applications and peer-to-peer communication. Blockchain and tokens can also be a part of these services. Blockchain could provide a ledger to keep track of transactions and tokens as a model to monetize distributed services. But they are not services alone. It is fundamental to have applications and services that are valuable to users, and then we can use blockchain and tokens in the implementation.

The question is, which services will provide the real breakthrough for users’ personal clouds, apps and pure P2P services, and when? They will probably be linked to personal data, self-sovereign identities, trusted communications and data sharing. We just need a few easy-to-use applications, and after that things can start to evolve rapidly.

The article first appeared on Disruptive.Asia.

11/22/2021

Comments

It’s often said people don’t appreciate things they get for free. Another way of looking at this is that it is difficult to determine the value of something if you don’t pay for it. With the cost of sending emails or making social media contacts virtually zero, is it harder to get value from them? Should we start to pay for contacts and messages?

Do you remember the time when there was just a landline phone at home? Or when you received letters through the mail? When the phone rang, someone definitely answered the call and actually took it seriously. When you got a letter with your name and a stamp on the envelope, it was something you wanted to open and read. Now you get robocalls that use VoIP, making them really cheap. You get a lot of emails, most of which you don’t even open or read.

How about social network contacts? You can send LinkedIn or Facebook invitations to almost anyone, and many people accept invitations from people they don’t know. One could say this has made people better connected and made the world more democratic. Earlier you could have tried to get into an exclusive club and use your contacts to help to arrange an important new introduction. But how much value do your social media contacts actually produce? Not much, and less each day, I would argue.

In an earlier article, I wrote how many social networks had become spamming networks. It’s great that prices go down and more people get access to networks and opportunities. But this also has its side effects. Everything becomes too crowded, and everyone tries to use the networks for their own purposes. When connections, communications and transactions have minimal or zero cost, people don’t think about using them properly. It leads to a situation where those networks and tools offer less value. It’s a bit like a government printing a lot of money: the money loses value, and eventually you can’t buy much with it.

Would this change if we had to pay for contacts, messages and transactions? Most probably. It doesn’t mean they should be so expensive that it significantly limits who can use the tools, but it would make people think about what they are doing. Maybe people would start to appreciate the contacts they have and the messages they receive more.

It doesn’t really matter to the users what technology makes transactions payable, but the user experience matters. To get this to work, very simple micro-payments are needed. At the moment, it looks like blockchain and tokens are the strongest candidates to change business models of messaging and social networking services.
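A toy sketch of how such a pay-per-message model could work, with invented class names and a made-up token price; in a real service the micro-payment would settle on a blockchain or similar ledger:

```python
# Toy pay-per-message sketch: a sender's balance is debited per message, and the
# message is only delivered if the micro-payment clears. All names are invented.

class Wallet:
    def __init__(self, balance):
        self.balance = balance

    def pay(self, amount):
        if self.balance < amount:
            return False
        self.balance -= amount
        return True

class Inbox:
    PRICE_PER_MESSAGE = 0.01  # hypothetical token price set by the recipient

    def __init__(self):
        self.messages = []

    def deliver(self, sender_wallet, text):
        # Spam without backing funds is simply never delivered.
        if sender_wallet.pay(self.PRICE_PER_MESSAGE):
            self.messages.append(text)
            return True
        return False

inbox = Inbox()
spammer = Wallet(balance=0.005)  # cannot afford even one message
contact = Wallet(balance=1.00)

inbox.deliver(spammer, "BUY NOW!!!")
inbox.deliver(contact, "Hi, I'd like to discuss a partnership.")
print(len(inbox.messages))  # only the paid message got through
```

The point is not the bookkeeping but the incentive: when every message has even a tiny cost, mass spam stops being free, while a serious contact barely notices the price.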

This is something that has been talked about since the 2017 ICO boom. The missing piece has been workable, effective end-user services, not just concept ideas. It is not realistic to think that totally new communications tools would replace the existing ones. New solutions to better manage contacts and messages should work, for example, with the existing email and messaging services.

One can also claim that people are not ready to pay for these commodities they have always had for free. And not all people will be ready to do it immediately, but people are happy to pay for things that make their life better, help them with daily tasks and give them greater status. There are many signs that people are now looking for better privacy and control of their data and activities, and security is also becoming more important.

People have always been willing to pay for exclusive clubs. They have been willing to pay for dinners with top politicians and celebrities. If someone you don’t know wanted to message you, you would be more interested in looking at the message if you knew they had paid for it, and it was not one of thousands of ‘free’ messages. If a user only accepted ‘paid’ messages, it would cut down the level of spam, too. Good contacts and important messages are premium goods, not commodities.

We will soon see services where people pay for messages, not for all messages, but some of them, e.g. to reach new contacts. We will also start to see services where people will have to pay for contacts, and they will have to give serious thought to which contacts they really want to invest in. But these services will need to offer the same usability as chatting, social media and email today. This concept could become one of the first big use cases for blockchain and tokens.

The article first appeared on Disruptive.Asia.

Picture

Private club. (Photo: Wikipedia)

11/15/2021

Comments

Data and computing have moved to a centralized model during the last decade as many services have gone to the cloud. This trend continues, and we will see many more companies move to the cloud. At the same time, we are starting to see a new trend toward more decentralized models. It is still a combination of different things, and it might look like fuzzy development, but it is really happening.

There are several reasons why we will see more distributed models for data and processing overall in the future. We can divide them into three main categories:

  1. Technology reasons. Many services (for example, self-driving cars, smart homes, personal assistant apps) require minimal latency and availability of services all the time, and some data and processing must be local.
  2. New business models. Blockchain-based token models open new opportunities to monetize distributed solutions. There is no need to process and charge services in one central place, opening opportunities to many new business model innovations.
  3. Data protection and privacy requirements. Data is becoming a liability for companies, and they want to find better models to use data with a lower risk. Consumers will have services to utilize their own data fully.

I have often said it is not hard to predict the future, but it is hard to know the right timing. It is also the case for this development. There are so many good reasons to have more distributed services that it will happen. But it is not easy to say how it will happen, where it will really take off and how long it will take.

We now see several technologies that make this development real. First, we have Edge computing, which comes into effect with 5G networks. Edge keeps data and processing closer to actual users. The challenge: are network vendors and telecom carriers the right parties to deliver these solutions when Internet giants like cloud vendors now dominate services and service development?

Secondly, we have blockchain, distributed ledger and token models. These are all developing rapidly, but they also have their challenges. It is not easy to say which technology can survive the longest. In this case, it is not only the technology but also the transaction data in those chains that must survive, making it difficult to make decisions about a particular technology. At the same time, these will challenge centralized platforms, as they offer totally new ways to distribute and monetize applications and data.

Thirdly, decentralized solutions can be implemented inside existing cloud solutions. We, of course, have regional cloud instances, but clouds enable other ways to decentralize services. For example, each user can have their own cloud services to use their data and run their own applications. Then, for example, with token-based charging models, they can also pay for using apps locally.

Of those three technologies, Edge has many challenges as it needs totally new infrastructure and applications to take advantage of it. It is currently much easier to make decentralized services by utilizing the current cloud infrastructure. But longer-term solutions can be another story. Technology disruption often attracts new companies that disrupt business. For example, Amazon and Google are tied to centralized models. Can they adapt when decentralization starts to happen and other vendors offer the latest solutions?

We will likely see two different development tracks for decentralized solutions. The first is distributed and decentralized applications. This track starts with the existing infrastructure and builds distributed solutions, such as a user’s own data cloud and application service, and it already has applications. The second track is developing a more decentralized data and processing infrastructure. This will take longer, but it can fundamentally change the structure of the Internet.

We are definitely moving to more distributed services. Thousands of startups are already developing services, data models and applications. Big technology vendors are investing in Edge type models, millions of people are trading cryptos, and forward-looking investors, like Andreessen Horowitz, are making big investments. At the same time, regulations are putting pressure on making new data models. The exciting part will be to see how it will happen, and the parties that make it happen will be the big winners.

The article first appeared on Disruptive.Asia.

11/5/2021

Comments

Some of you might remember when home and personal computers were emerging in the 1980s. Many different companies made their own devices, like Commodore 64, Apple II, Spectravideo 328, Sinclair ZX80 and Atari. Then some manufacturers agreed on standards like MSX that never actually became globally significant. But then personal computers (PCs), with PC-DOS and MS-DOS, started to occupy offices and then homes, and Apple created the only other option. We now have a similar situation with wearable devices.

In the 1980s, most computer manufacturers had their own operating systems and a small range of programs. Early adopters had those devices more as a hobby than to really utilize them. There were all those stories about use cases like recipe databases or calculating your taxes, but only if you coded your own program. Many users actually did write their own programs and shared them with other users. As a teenager, I tried to explain to my father the value of owning a computer. It was not an easy task when he didn’t feel that coding your own games or graphics programs was a valuable reason to have the device.

How is this relevant for wearables? We now have more and more wearable device manufacturers that offer their devices with their own proprietary functionality, data models and applications. Many users are still early adopters, like biohackers and health enthusiasts, who explore ways to utilize the data.

Most users can understand a couple of data points like average heart rate and the number of daily steps. Those are a good start to observe and improve personal health, but it is a small part of the data and the opportunities these devices can offer. Some additional data points like Heart Rate Variability (HRV) and different sleep types (deep, REM and light) are much harder to interpret and utilize daily.

One could ask, as my father did, whether it makes sense to pay $400 for a device to see heart rate and daily steps. Or why pay a monthly subscription of over $100 for a glucose measurement device, or buy more expensive shoes to measure cadence, stride length and foot strike angle? For many people, use cases like a device letting you know when to go to sleep sound as naive as a recipe database.

Each manufacturer also has its own scores. For example, sleep and readiness scores from one device are very different from those of another, and there is no easy way to combine data from different devices properly. You can combine some data, for example into Apple Health, but it then contains a lot of data points that are even more confusing than the data in the device’s own apps.

What changed the computer market? How did computers start to become more useful? It happened when software packages started to appear. A couple of operating systems, from Microsoft and Apple, started to dominate, and both systems had enough software being generated by third parties. This evolved into the software industry making software for personal computers. We saw a similar development in mobile phones: the mobile application business started to grow only when we moved from proprietary systems to two main operating systems, iOS and Android, which enabled application stores to make a business out of applications.

As we have discovered, wearables are not only for data; they can also be accessories and fashion items. A luxury brand could launch its own smartwatch or ring, but luxury brands are not really high tech or data companies. And it is not very convenient for users that each watch, ring, sensor, pair of shoes or jacket offers its proprietary data format and application. It would also be much better for luxury goods companies to have some common data models and ecosystems.

The real utilization and software market for wearable and wellness data can emerge only when we get data from different devices to a compatible format. When we have two or three environments, software developers can make better software and applications to utilize data to help people in their daily lives. We cannot expect each individual to start to interpret all kinds of health data points and try to Google instructions and what to do based on them. A special requirement with wellness and health data is that it is even more sensitive than data for many other purposes, and privacy is crucial.
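As a small sketch of what a compatible format could mean in practice, the code below (hypothetical vendor formats and field names) normalizes sleep data from two imaginary devices into one shared schema:

```python
# Sketch: normalizing sleep data from two hypothetical device formats into one
# shared schema, so third-party software can work with either device.

def from_ring(raw):
    # Hypothetical ring vendor reports durations in minutes.
    return {
        "deep_sleep_h": raw["deep_min"] / 60,
        "rem_sleep_h": raw["rem_min"] / 60,
        "resting_hr": raw["rhr"],
    }

def from_watch(raw):
    # Hypothetical watch vendor reports seconds and nests heart-rate data.
    return {
        "deep_sleep_h": raw["sleep"]["deep_s"] / 3600,
        "rem_sleep_h": raw["sleep"]["rem_s"] / 3600,
        "resting_hr": raw["heart"]["resting"],
    }

ring_night = from_ring({"deep_min": 90, "rem_min": 120, "rhr": 52})
watch_night = from_watch({"sleep": {"deep_s": 5400, "rem_s": 7200},
                          "heart": {"resting": 53}})

# Once normalized, records from different devices are directly comparable.
print(ring_night["deep_sleep_h"] == watch_night["deep_sleep_h"])  # True: 1.5 h each
```

An ecosystem would of course need an agreed schema rather than two ad-hoc adapters, but the adapters show why a common format is the unlock: software written against the shared schema works with any device.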

It has been said that the IoT market won’t really be a hardware business but a data and software business. Wearables will basically be sensors to collect data. Some sensors could be branded devices, others white label components in clothes, shoes or accessories. But the real utilization of data needs environments where the user can combine the data, and software can be offered to users to help them live better and healthier lives. The real business and value of wellness data will be software and applications that can combine all kinds of wearable data with other data sources.

10/27/2021

Comments

We have now waited a decade for the big wave of fintech company launches. People are frustrated with traditional banks and their services. Neobanks grow, but they are still tiny compared to conventional banks. Crowdfunding and P2P lending were going to change the market too, but they are still relatively marginal. Crypto finance grows, but is it a finance model, asset class or speculation?

Stripe and Coinbase have been the big success stories in fintech with huge valuations. On the other hand, the collapse of Greensill Capital in the UK was seen as a setback for fintech. These examples just demonstrate how broad the fintech sector is. In reality, Greensill had nothing to do with fintech, but it wanted to attach that attractive tagline to itself. Greensill was a supply-chain finance service and failed due to its risk management.

There are dozens of digital-only neobanks in the world. It is estimated they have approximately 40 million customers. This is still a very small number, but the valuations of the neobanks have grown rapidly. According to Accenture, a neobank loses on average $11 for each customer, i.e. costs versus returns. They still struggle to find a profitable business model. Basic bank accounts are not profit centers. Lending, investment, and niche services (e.g. business banking, special customer groups) are more common areas to make a profit. Still, they are very different from basic digital accounts, and their risk management is markedly different. 

Some neobanks, such as N26, WeBank and Monzo, offer the full stack, i.e. they have a banking license and run their own front-end and back-end operations. Then there are neobanks, like Revolut and Chime, that have no banking license; they offer a front end but use legacy banks’ licenses and back ends. The banking license is the complex part if we think of scalability and global growth. It is a significant investment to get a banking license in every new country.

Ten years ago, we had a lot of expectations with crowdfunding and peer-to-peer lending. Those services have grown but not yet come to the mainstream. The P2P lending market is growing approximately 25% per year, but a significant part of the money comes from financial institutions that use those services as a customer interface. One of the biggest P2P lending success stories, LendingClub, acquired a bank last year and decided to close down its P2P lending platform.

Crowdfunding has had many models. Equity crowdfunding focuses primarily on startup funding, while pre-order models, like Kickstarter, help sell new products before they are available. There are also models that sell investment fractions in real estate, art and other assets. Kickstarter has been an important test market for new consumer products, but otherwise these models have suffered from regulatory restrictions and haven’t come to the mainstream. In equity crowdfunding, the UK has been the leading country. However, its most significant services, Crowdcube and Seedrs, are still small businesses; they tried to merge, but the competition regulator blocked the merger.

Cryptos hit new records, especially with bitcoin’s growing value, and NFTs have become popular. It is still hard to say what this means for blockchain-based distributed finance services. Many parties still see cryptos more as digital commodities and NFTs as digital asset certificates, not yet as challengers to the whole traditional finance system. Coinbase, which managed a successful IPO, is still more like a conventional trading service for ‘crypto assets’, not, for example, a distributed finance service itself.

If cryptos become more accepted and feasible for daily payments, and NFTs make asset certificates and transactions digital, it changes the everyday use of both. Could they better enable crowdfunding and P2P finance? Some experts think so, but it is still hard to say, since the last decade has shown those models are not easy to get working. It is not only about technology but really about getting a market to work with enough supply and demand. Regulators also have a significant impact on the market. In some countries, we might see approaches like government-run digital currencies rather than genuine cryptos.

Then we have the fundamental question about the banking system. Banks are not just there to offer accounts and payment cards and to lend money; they also have an essential role with the central banks in issuing money and keeping the economy running. Some people criticize that system and would like to see the power of banks disappear. In reality, it is not so simple, and governments prefer to keep control over their finance systems. The pandemic has also reminded us of the value of government stimulus activities.

Some banks are starting to accept cryptos, and one of the leading VCs, Andreessen Horowitz, is planning a $1 billion fund for cryptos and blockchain, its third in the sector. We can assume blockchain and distributed solutions will change and digitize daily assets and transactions. The first phase will probably be digitization, not changing the fundamentals of the finance and banking system. But it is still hard to say how they can change the finance sector and its services in the long run.

The article first appeared on Disruptive.Asia.

10/17/2021

Comments

There’s been a growing trend to get rid of middle management, which coincides with the belief that AI and software robots can automate work done by human professionals. So, fewer managers are needed and, therefore, less human resources management. But when we have more machines doing the work, they also need to be managed. Consequently, we probably need digital counterparts of middle management and human resources.

A recent podcast features an interesting discussion about micro-tasks that analyze data and make micro-predictions, such as analyzing a specific dataset from one source and trying to conclude something from it. This analysis doesn’t try to understand or optimize a more significant problem or task; it just focuses on one specific part. A more extensive system can have dozens of components like that.

Then there is another layer that combines the output from those micro-tasks. It merges the output and conclusions from several micro-AI modules, where each individual micro-AI focuses on modeling and explaining one specific dataset.

For example, running shoes (yes, there are already running shoes that collect all kinds of data) can report your cadence, stride length, ground contact time and foot strike angle to optimize your running speed. When you think about your running performance as a whole, this is only one part. You must also think about heart rate, energy levels (blood glucose), readiness (have you slept enough?) and many other things. It would be too complex to build one huge AI to optimize all this data; it is better to have modules for each need and then another layer to combine them all.

It is the same with software robots. One robot can transfer inventory numbers at the end of the month from SAP to your accounting system. Producing monthly financial statements and reports is much more work than simply compiling those inventory numbers, so other robots could perform the individual tasks, with some higher-level robots putting all this information together.

This is nothing new as such. Modularity has been an essential principle in designing software for decades. With AI and automation, we often talk about extensive and complex solutions linked to many tasks and systems around an enterprise. Because these are also relatively new areas, each company and project usually tries to build a large system intended as a perfect solution for a significant process.

When we have these micro-modules to handle specific needs, we can then develop design principles: not to implement from scratch, but to find the best components for micro-tasks, optimize their use and then get them to work together. It is a kind of HR and management function. You must find the best resources to do the things you need, and then you must manage them. But these management layers are digital, i.e., algorithms that choose the best algorithm for each micro-need and use them optimally. Algorithms manage algorithms.
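A minimal sketch of such a digital management layer, with deliberately trivial stand-in "micro-models": the manager scores each candidate on held-out data and routes the task to the best one. All names and data here are invented:

```python
# Sketch of a "manager algorithm": given several candidate micro-models for the
# same micro-task, it scores each on validation data and picks the best one.

def model_mean(history):   # predicts the average of past values
    return sum(history) / len(history)

def model_last(history):   # predicts the most recent value
    return history[-1]

def model_trend(history):  # extrapolates the last step linearly
    return history[-1] + (history[-1] - history[-2])

CANDIDATES = {"mean": model_mean, "last": model_last, "trend": model_trend}

def pick_best(series):
    """Score each candidate by absolute error on a held-out point, keep the winner."""
    train, held_out = series[:-1], series[-1]
    errors = {name: abs(model(train) - held_out) for name, model in CANDIDATES.items()}
    return min(errors, key=errors.get)

# A steadily rising series: the manager routes this task to the trend model.
cadence = [160, 162, 164, 166, 168]
print(pick_best(cadence))
```

The candidates here are one-liners, but the managing layer would look the same if they were full models from a marketplace: evaluate, select, and keep re-evaluating as data changes.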

This also changes the ecosystem and business models for AI and automation. You have, for example, the following business areas:

  1. Open source and crowdsourcing communities develop a lot of solutions for all kinds of micro-tasks. There can be several versions for each need, and the most suitable one can be selected for each case.
  2. Companies and developers can also start selling their components for micro-needs. They can focus on developing the best solutions for a particular need.
  3. APIs in each module become more critical when making it more frictionless to get several modules to work together.
  4. There will be marketplaces and sharing services (like GitHub) to share and sell these components.
  5. An important new function layer will be the one that gets micro-modules to work together optimally for different needs.
None of these models is new as such, but in the automation and AI industries, many of these functions are still in an early phase. Many tools in those areas are still based on a proprietary, closed ecosystem model that doesn’t enable or support this more open approach. Developers need to find a more effective way to think about implementation, and companies a new way to think about business and offerings. All must be based on open ecosystems and layered implementation.

Sometimes it is good to compare technology and machines to the models by which human beings perform jobs, especially when AI and automation are to perform tasks that humans have previously done. People have, after all, spent centuries developing models for how to organize tasks in organizations. It doesn’t mean we can or should copy the same models to machines, but it can give us ideas on how best to use them. There are reasons why people specialize in certain areas, how different professionals work together and why the management layer must optimize resources. We need to solve similar issues when designing, using, and managing algorithms, machines, and digital processes.

The article first appeared on Disruptive.Asia.

Photo source: Wikipedia. 

9/26/2021

Comments

Numerous cities around the world want to become ‘smart’ cities. One main objective of smart cities is to collect data to improve and develop services. As a result, many vendors are keen to get into the smart city business. These projects are network, infrastructure and big data intensive. So how does this benefit ordinary people? Value to individuals and their privacy seems to have a lower priority, although the ultimate target should surely be to improve the lives of residents.

Smart city concepts started to trend some years ago and are increasing in popularity. 5G and Edge are also seen as essential technology boosts for those projects, which is why network vendors and carriers are involved in most of them. Smart cities are seen as a good reason to build technology infrastructure to collect, transfer and analyze all that data.

Cities aim to collect and analyze data to optimize services and operations for many purposes, such as traffic management, public transportation, power consumption and production, water supply, waste collection, crime reduction, healthcare and community services. Environmental aspects are also becoming more critical: air quality, noise pollution and energy consumption are other areas cities want to improve.

This all sounds great, but as we know from many other technology projects, it is very different from focusing on developing services for individuals, their user experience, and their unique needs and values. Beyond that, privacy and data protection are now critical issues in these kinds of huge data projects. At worst, smart city infrastructure resembles a real ‘big brother’ scenario.

It is possible to build smart cities that serve individuals better, but it would require parties to develop services from a consumer’s perspective. The concept could help people get better services, optimize their movements, live healthier lives, save time and money and improve the quality of life in many ways. Ten years ago, we had to rely on mobile app developers to provide useful apps to individuals because carriers and network vendors were not able or motivated to do it.

Many services would also become more valuable if we were able to combine personal and public data. Your movements combined with traffic and public transportation data, air quality data with your daily walking and running routes, and your personal habits with daily energy consumption peaks are just some examples. Together, the two data sources could create value for the individual and society.
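As a small illustration of combining the two data sources, the sketch below (invented coordinates and air-quality values) matches each point of a personal running route to the nearest public air-quality sensor and computes the average exposure along the route:

```python
# Sketch: combining a personal running route with public air-quality readings by
# matching each route point to the nearest city sensor. All values are invented.

# Public data: sensor position (x, y) and its air-quality index (lower is better).
sensors = [
    {"pos": (0.0, 0.0), "aqi": 35},
    {"pos": (4.0, 0.0), "aqi": 80},
    {"pos": (0.0, 4.0), "aqi": 50},
]

def nearest_aqi(point):
    def dist2(s):
        return (s["pos"][0] - point[0]) ** 2 + (s["pos"][1] - point[1]) ** 2
    return min(sensors, key=dist2)["aqi"]

# Personal data: points along today's run, which never need to leave the device.
route = [(0.5, 0.5), (2.5, 0.5), (3.5, 0.2)]

# Combined value: average exposure along the route, which an individual could
# use to pick a cleaner route next time.
exposure = sum(nearest_aqi(p) for p in route) / len(route)
print(round(exposure, 1))
```

Note the direction of the data flow: the public sensor data comes to the individual's own application, so the route itself is never handed to a central service.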

This could be achieved if individuals had access to public data combined with their own personal data. In this way, privacy could be respected and preserved. But if public services start to surveil individual people, we immediately encounter data protection and privacy risks. It would also lead to a model in which cities, authorities and service providers plan what they think is suitable for individuals, rather than offering tools for individuals to improve their own lives.

For the city authorities, infrastructure vendors and carriers that dominate these projects, it is neither easy nor natural to build systems from an individual’s point of view. Of course, politicians in the city councils should be thinking of the residents they represent, but that is not enough. We also need technology solutions and vendors focused on building solutions and services for individuals.

This would likely involve an additional layer for the services. Maybe something similar to app stores made for mobile apps that also enable users to protect their privacy and manage their personal data. It could also empower many other parties to develop services for residents and give them the power to decide what services they want to use. The best services are hardly ever developed by authorities and big tech companies deciding on what the individual wants.

Smart cities should be focused more on the needs of residents. There are many ‘nice’ and ambitious plans to make cities and the lives of residents better, but nice plans are never enough. The real questions are who are the actual customers, who can decide which services to use and who will control the data. To make these services beneficial for people, the concepts, technology, architecture, data and business models should be designed to empower people, not just to surveil and control them.

9/10/2021


Non-fungible tokens (NFTs) have gathered a lot of interest recently. They certify digital assets, including digital art pieces worth millions of dollars; Christie's has already sold an NFT work of art by Beeple. Of course, this raises the question: is this something more concrete than the Initial Coin Offerings (ICOs) of 2017?

NFTs are digital certificates on a digital ledger, or blockchain, that prove a digital asset to be unique and therefore not interchangeable. NFTs are used to represent and certify photos, videos, audio and other types of digital files. Art is currently getting all the publicity, but NFTs can certify many other items, including text, software code or even tweets. The fundamental idea is that a digital object can be tokenized and becomes unique in that way: the file itself can still be copied, but the token certifying it cannot, in principle, be duplicated.
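The core mechanism can be sketched in a few lines. This is a toy, in-memory version of the idea behind an ERC-721-style registry, assuming the essentials only: each token ID maps to exactly one owner, minting a duplicate ID is impossible, and only the current owner can transfer. Real NFTs live on a blockchain, not in a Python dict, and the token and owner names below are invented.

```python
# Toy NFT registry illustrating uniqueness of the token (not the file):
# one token ID, one owner, no duplicates.

class NFTRegistry:
    def __init__(self):
        self._owners = {}  # token_id -> owner address

    def mint(self, token_id: str, owner: str) -> None:
        if token_id in self._owners:
            raise ValueError("token ID already exists; IDs are unique")
        self._owners[token_id] = owner

    def owner_of(self, token_id: str) -> str:
        return self._owners[token_id]

    def transfer(self, token_id: str, sender: str, recipient: str) -> None:
        if self._owners.get(token_id) != sender:
            raise PermissionError("only the current owner can transfer")
        self._owners[token_id] = recipient

registry = NFTRegistry()
registry.mint("artwork-001", "0xBuyer")
registry.transfer("artwork-001", "0xBuyer", "0xCollector")
print(registry.owner_of("artwork-001"))  # 0xCollector
```

Note what the registry does and does not guarantee: it tracks a unique token and its owner, but it says nothing about copies of the underlying image or audio file, which remain freely copyable.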

Some people have commented that the irony of NFTs is that although they are called non-fungible, they are easy to trade. They can be unique, but exchanging them is simple. And as we know, things can have value and liquidity if there is enough demand and supply and transaction costs are low enough.

Many of us can still remember the 2017 ICO boom, when companies started to offer their own tokens. Typically, they were startups (or not even startups but startup ideas) with business plans (called white papers). They included a token as an important component of their business plans and then started to sell those tokens. Some projects were able to collect significant money and, in rare cases, built a long-term business. Many people participated in ICOs just to learn how to buy a token, not thinking of the ROI. Some people had many bitcoins and a hard time selling them (because they couldn't explain how they had acquired them) and wanted to diversify into other tokens.

A fundamental difference between NFTs and ICOs is that ICO tokens usually represented only future promises, whereas NFTs represent assets, especially digital assets. In that way, buyers can form their own view of an asset's value. It is always complex to evaluate the value of art, and NFT art has precisely the same challenges. Then there are many other digital items, like pieces of music, virtual items in games and software components, that can have an NFT.

There are also plans to expand the NFT concept from virtual and digital items. There could also be digital certificates to represent physical items, for example, a certificate to prove real estate ownership. This part requires a legal framework that enables the use of this kind of digital certificate.

NFTs have also generated crowdfunding plans. People and companies could sell fractions of their work, for example, music, movies or software. NFTs can make this market more efficient, but they don't remove all crowdfunding challenges, especially how to find the correct value and then make the secondary market liquid. It is also good to remember that the model can work for items with enough supply and demand, but NFTs alone don't guarantee liquidity for any item.

There are several new business plan ideas based on NFTs. For example, if software is published as an NFT, there could be a new GitHub-like service specifically for NFT software. Companies and individuals could start to license data as NFT packages, and media companies could also offer NFT content.

Ethereum, which is transitioning to a proof-of-stake model, is the most commonly used solution for NFTs. Blockchain still has fundamental questions around which solutions have a long-term future and value. When blockchain software is updated and a fork is created, backward compatibility is an important question. A soft fork means the new version is backward compatible, and a hard fork means it is not. If a new version is not backward compatible, then old tokens won't work in the new system. In the end, it is the community of each token that decides which updates and forks take place. The fundamental question for each blockchain solution is its future backward compatibility. At the moment, Ethereum looks like a safe bet for implementing blockchain-based solutions; with lesser-known blockchains, it is harder to predict the future.
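The soft-fork/hard-fork distinction can be illustrated with a toy validation rule. Assuming (for illustration only) a single rule, maximum block size: a soft fork tightens the rule, so old nodes still accept the new blocks, while a hard fork loosens it, so old nodes reject them. The numbers are invented.

```python
# Toy illustration of fork compatibility: a node validates blocks
# against its own rule set.

OLD_RULES = {"max_block_size": 1_000_000}
SOFT_FORK_RULES = {"max_block_size": 500_000}    # stricter: old nodes still accept
HARD_FORK_RULES = {"max_block_size": 2_000_000}  # looser: old nodes reject new blocks

def valid_under(rules: dict, block_size: int) -> bool:
    """Would a node running these rules accept a block of this size?"""
    return block_size <= rules["max_block_size"]

soft_block = 400_000    # produced under soft-fork rules
hard_block = 1_500_000  # produced under hard-fork rules

print(valid_under(OLD_RULES, soft_block))  # True: soft fork is backward compatible
print(valid_under(OLD_RULES, hard_block))  # False: hard fork splits the chain
```

This is why a hard fork forces the community to choose sides: nodes that don't upgrade simply stop accepting the new chain's blocks, and tokens on one side don't exist on the other.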

The NFT concept is more concrete and makes it easier to evaluate items than ICOs did. But in the end, an NFT’s value depends on the underlying items, so it is impossible to say if an NFT as such represents something valuable or only empty promises. NFT is an excellent model to manage and trade the value of digital items. But it is crucial to remember that an NFT alone doesn’t create value for a digital item. The items must have value, and NFTs help to make the value tangible.

The article first appeared on Disruptive.Asia.

Image: Crypto ATM at Crypto Valley in Zug, Switzerland.

9/2/2021


The COVID-19 pandemic has been significant for wearable devices. They have helped to detect early COVID-19 symptoms, and they have also helped people live healthier lives and take care of their wellbeing during the pandemic. The last 18 months have been a good time for many digital services, from video conferences to food delivery apps. Maybe it will permanently change how we manage our wellness and health and help mobile healthcare become mainstream.

An elevated resting heart rate and body temperature are early signs of COVID-19. Research institutes and universities have, for example, developed software that uses Oura ring data to detect these early symptoms. Employers have also bought wearable devices for employees to detect early symptoms and warn them not to come to work if there are warning signs. This is the case in companies ranging from customer service and healthcare to professional sports teams.
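The kind of check behind this can be sketched as a baseline-deviation test: compare today's resting heart rate against a personal rolling baseline and flag readings that are unusually high. The threshold and sample values below are invented for illustration and are not clinically validated.

```python
# Hypothetical early-warning check: flag a reading that deviates
# strongly upward from the wearer's own recent baseline.

from statistics import mean, stdev

def flag_anomaly(history: list[float], today: float, z_threshold: float = 2.0) -> bool:
    """True if today's reading is unusually high versus the personal baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > z_threshold

resting_hr_history = [58, 60, 59, 61, 57, 60, 58]  # last week, beats/min
print(flag_anomaly(resting_hr_history, 59))  # False: within normal range
print(flag_anomaly(resting_hr_history, 72))  # True: elevated resting heart rate
```

The same pattern applies to skin temperature or any other metric with a stable personal baseline; what matters is the deviation from the individual's own normal, not a population average.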

Wearable manufacturers have reported that, based on their data, the COVID situation has also helped some people sleep better. The reason might be that people don't need to hurry to work or take the kids to school in the morning. But we have also seen, as the situation continues, that more people feel stress: based on the data, they have a higher heart rate (HR) and don't sleep as well.

The situation has also changed exercise habits. People no longer walk to work or take public transportation, and there are no daily breaks to go out for lunch or coffee. Health professionals are worried that people sit too much during the pandemic. Others have started to exercise more, replacing daily walks with daily runs, which has resulted in more sports injuries.

All this has prompted people to monitor their daily wellness and health data. People have also hesitated to see a doctor or go to the hospital, instead monitoring their health with a smartwatch that measures heart rate or ECG (electrocardiogram). And if you have a Zoom call with your doctor, it is genuinely useful to have that data at hand.

All of this shows that people are using more of these devices and getting more data, but it's not that simple. What should I read into my heart rate or heart rate variability? Do I exercise too little or too much? Are my sleep quality and exercise linked? What combination of data actually indicates an illness?

More data doesn't mean people suddenly become health, sleep, diet and wellness experts. Some might feel like experts after Googling healthcare instructions, but that can make things worse. This data can be beneficial for health and wellness monitoring, but it needs better software to analyze it or make it available to professionals.

Mobile healthcare has been a hot topic for years, but the COVID era has really brought it to the fore. Healthcare organizations tend to be rather conservative about taking on new things, but this period has forced them to find new solutions quickly. I know many mobile healthcare startups that have struggled for years. One big problem has been that healthcare organizations move slowly, making them difficult customers for agile startups. The other problem is getting access to reliable and accurate data. Many of those companies have offered solutions to transfer data to a doctor or hospital, but often people have to capture the data themselves, e.g. measuring their glucose, blood pressure or heart rate and entering it into an app. Some people find this challenging, and others are just too lazy to do it. And there are those who might want to 'fix' their own numbers, either to avoid embarrassment or to show off.

So, now we have more data, and we have solutions to transfer the data. But we still have a couple of problems: 1) privacy and data security for sensitive wellness data, and 2) more systematic models to utilize data from several wearable devices, not just one. This means we need solutions that collect and combine data from several devices while protecting privacy. It would also help if this data could later be combined with other healthcare data, like health history.
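One way to picture the multi-device problem: each wearable exports readings in its own format, a local aggregator normalizes them into one schema, and only a derived summary (not the raw stream) would ever need to leave the device. The device export formats and field names below are hypothetical, invented purely to show the normalization step.

```python
# Sketch of local aggregation across wearables: normalize per-device
# formats into one schema, then compute a privacy-friendlier summary.

def normalize_ring(sample: dict) -> dict:
    """Convert a (hypothetical) sleep ring export into the common schema."""
    return {"metric": "resting_hr", "value": sample["rhr"], "source": "ring"}

def normalize_watch(sample: dict) -> dict:
    """Convert a (hypothetical) smartwatch export into the common schema."""
    return {"metric": "resting_hr", "value": sample["restingHeartRate"], "source": "watch"}

def daily_summary(samples: list[dict]) -> dict:
    """Aggregate normalized readings; only this summary would be shared."""
    values = [s["value"] for s in samples]
    return {"metric": "resting_hr", "avg": sum(values) / len(values), "n": len(values)}

readings = [
    normalize_ring({"rhr": 58}),
    normalize_watch({"restingHeartRate": 62}),
]
print(daily_summary(readings))  # {'metric': 'resting_hr', 'avg': 60.0, 'n': 2}
```

The design choice worth noting is where the aggregation happens: doing it locally and sharing only the summary addresses problem 1 (privacy) and problem 2 (combining devices) at the same time.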

With new technology and concepts, it typically takes years to make a breakthrough, and it often takes a special trigger to make things happen. I remember the first great mobile health tech visions 20 years ago during the 3G hype. Now it looks like the pandemic has helped us over some major obstacles, and the wearable market has also developed rapidly. We should now see rapid and significant development, with more applications and services using wellness data effectively and a subsequent boost to mobile healthcare.

The article first appeared on Disruptive.Asia.

