Thursday, December 17, 2009

Wireless Telehealth Needs Standards and Interoperability

I am providing the link to an article in MobiHealthNews with little commentary.  The article can be found at:  My one comment is that the interviewee's objectives for tele-medicine appear similar to my own: provide medical care that keeps patients out of hospitals and nursing homes. 

The article is an interview with George MacGinnis who is with the Assistive Technology Programme at the NHS Connecting for Health in the UK.  He was interviewed by MobiHealthNews at the Mobile Healthcare Industry Summit in London.  I think it is well worth taking the time to read this interview.  In addition, MobiHealthNews has included a video of the interview.

Tuesday, December 15, 2009

Revamping the Revenue Generation Model in the Medical Device Industry

My fourth posting on this blog on 29 September 2009 was part of a multi-part examination of Medtronic's remote programming patent (US Patent # 7,565,197, granted on 21 July 2009).  I suggested that the patent implied two directions in the development of medical devices:
  1. The development of a single, common hardware platform based on a generalized processor, similar to TI's low-power processors.
  2. Medtronic device capabilities would be defined primarily by software.  Furthermore, the patent describes downloading software to a device, thus providing a means of updating the software on an implanted device.
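To make the idea concrete, here is a minimal sketch, in Python, of a device whose clinical capabilities are defined entirely by installed software modules.  All class, module and platform names here are my own illustrative inventions, not anything taken from the patent.

```python
# Hypothetical sketch: a common hardware platform whose clinical features are
# defined entirely by installed software modules.

class DeviceFirmware:
    def __init__(self, platform_id):
        self.platform_id = platform_id
        self.modules = {}          # feature name -> installed version

    def install_module(self, name, version):
        """Simulate a remote software download enabling a new capability."""
        self.modules[name] = version

    def capabilities(self):
        return sorted(self.modules)

# The same generic hardware platform configured as two different devices:
pacer = DeviceFirmware("common-platform-01")
pacer.install_module("bradycardia_pacing", "1.0")

icd = DeviceFirmware("common-platform-02")
icd.install_module("bradycardia_pacing", "1.0")
icd.install_module("tachyarrhythmia_detection", "2.1")
icd.install_module("defibrillation", "2.1")

print(icd.capabilities())
# ['bradycardia_pacing', 'defibrillation', 'tachyarrhythmia_detection']
```

Under this model, an "upgrade" is an `install_module` call over the network rather than a surgical replacement.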
We've learned that there are technologies in development that could significantly increase the battery life of devices, perhaps at some point eliminating the need for battery replacement altogether.

Today, physicians, hospitals and device manufacturers receive the bulk of their payment when a device is implanted or replaced.  Thus, the current business model of device manufacturers relies primarily on products such as ICDs, CRTs and leads.

However, the Medtronic patent suggests the possibility, maybe even the likelihood, of a strategic shift from a product to a licensing business model.  This would suggest a business model similar to that of software companies that charge a flat or yearly fee for the use of software.  Instead of a replacement, the patient receives a software upgrade and the device company receives payment for the software upgrade.  This is one step removed from a pure product model toward a service-oriented model, but it still treats the software as a product.  Nevertheless, it provides flexibility to the medical device company in that revenue becomes less tied to the sale of objects and more tied to the services provided to the customer.

An even more innovative approach, and one more in line with a service-oriented business model, would be to have the software redefine the capabilities of the device itself while implanted in the patient.  For example, upgrade an ICD to a CRT-D by changing software.  I do not know the technical, implantation or lead-related issues of doing this; however, from a software standpoint, there should be nothing stopping a device manufacturer that has taken the common hardware design approach.

A pure service-oriented model would charge on the basis of the services provided.  Since I'm a technologist and not an MBA who has worked in the device industry for decades, I cannot define all the possible revenue-producing services that medical devices with remote monitoring and remote programming could provide device companies.  I can say that the services that medical device companies can provide medical care providers and their patients are becoming less and less tied to the devices themselves.  So a more service-oriented perspective in the medical device industry seems warranted.

It seems apparent that for medical device companies to expand their services and patient-care and management capabilities with information-based services delivered over the communications infrastructure, they are going to have to change the way they receive revenue.  The current product-based revenue model provides companies little incentive to expand into information-based services.  I suspect that in a relatively short time, Medtronic will propose a new revenue model.  I shall be watching for the signs.

Sunday, December 13, 2009

Essay: Economical Medicine

To my readers:  I have been engaged in high-priority activities for my current client and have unfortunately neglected this blog.  I plan on publishing a flurry of articles from now to the end of this year.  Furthermore, I am re-initiating my review of patents and patent applications.


In this essay I discuss some of my observations regarding the US medical system and what I consider could be the impact of remote monitoring technology on US medical practice.

I hope that people outside of the United States read my blog.  I provide my perspective as one US citizen on US culture and medical practice.  I hope that others will chime in and provide their perspectives on the US medical system and, if citizens of another country, on their own medical systems.

I argue that remote monitoring can provide high quality health care at a lower cost.  Remote monitoring provides lower cost health care primarily by keeping people out of hospitals.  As a result, the huge infrastructure devoted to hospitals will likely wither.  Hospitals will always have a place, but they'll become smaller and targeted to providing critical services such as trauma care, critical care and post-operative recovery.  People will spend less time in hospitals, but physicians and automated care-givers will be able to monitor patients wherever they are located, which will mostly be away from the hospital.

But before I discuss my views on remote monitoring and its place in economical medicine, I discuss my concepts of economical medicine.

Economical Medicine

My home is Chicago, Illinois, and over the last few years I have seen a spate of new hospital construction.  Admittedly, there are areas where there are too few hospital beds and services.  Still, I have been astonished by the amount of recent construction.  It seems that the hospitals are competing with each other to see who can provide the newest, most up-to-date facility.  Furthermore, many of these same hospitals purchase the most expensive scanning equipment available and build large testing laboratories.

The United States provides some of the worst and the best medical treatment available in the industrialized world.  If you want something extraordinary performed, come to the US.  Where the US fails is in providing mundane care to the majority of its populace.  Our outcomes for the extraordinary are fabled, but the US ranks 37th in the WHO health care rankings, behind countries such as Costa Rica, Colombia and Dominica, besides the obvious ones such as France, Switzerland, Austria and Italy.

A landmark study published in 2000 showed that the US has the most expensive health care system in the world based on per capita and total expenditures as a percentage of gross domestic product.  

In 1998 the US spent $4,178 per person on health care.  The study median was $1,783, and the closest competitor was Switzerland at $2,794.  US spending as a percentage of gross domestic product was 13.6 percent; the closest countries were Germany (10.6%) and Switzerland (10.4%).  And things since 1998 have only gotten more expensive in the US, to the point where US care costs have reached crisis proportions.

Yet in the midst of an attempt to repair the US crisis, members of the US Congress, including nearly every member of the Republican party, have demonized any attempt to make the cost of health care more reasonable, even as the costs of US health care continue to increase at a pace that will eventually drastically lower the standard of living of the majority of Americans.  Why is this?

The roots of the opposition are clearly political and grounded in the economic interests of, primarily, the US health insurance companies.  Health insurance companies nearly own and operate many members of the House and Senate on matters of health care.  And these insurance companies decided to declare war against any public insurance option, especially a strong one, e.g., anything close to Medicare for the rest of us.  However, there is cultural resistance as well.

Culturally, Americans are profligate.  We are a non-economical culture that believes itself to have no limits.  Our sense of limitlessness is our greatest strength and weakness, and it has been running out of control for a long time.

Americans build roads and cars instead of building trains and tracks.  We built muscle cars with large and powerful engines for decades instead of fuel-saving vehicles.  We built suburbs along our superhighways and commute long distances to work in vehicles that consume excessive amounts of fuel.  We built large houses with little insulation that consume excessive amounts of fuel to heat and excessive amounts of electricity to cool and light.  Growing up in this culture, my sense is that many Americans equate excess with the good life.  That need not be the case.

We have a medical system that costs too much, delivers too little, and places undue burdens on its practitioners, such as malpractice insurance costs and excessive paperwork.  In addition, it has been perceived by a wide variety of players as a way to make massive, excessive amounts of money.  Getting fairly paid for a medical service, product or drug is a good thing.  Excessive payments can corrupt or bankrupt an entire system. 

Lower Cost Does Not Necessarily Equal Lower Quality

Over the past several decades we have been privy to a revolution: the ubiquity of computing power.  Compare the cost of a 1984 Apple Macintosh or a 1984 PC with one today.  The costs are comparable or lower, but the computational power has skyrocketed.  Everything in the computational and communications sphere has increased while the cost has decreased.  Supercomputers and supercomputer availability, rare in the 1980s and early 1990s, have exploded in the last decade.  Sophisticated handheld computers with voice and data capabilities that dwarf the power of a 1990s desktop computer are available for hundreds of dollars.

Remote Monitoring

Remote monitoring is a minor outgrowth of the computing and data communications revolution.  It makes some use of the continuing computer and telecommunications developments, but so far, relatively little.  However, the potential is there, as is the interest in spreading the capabilities of the computing and communications revolution to the medical community.  In fact, I believe that many computer scientists and engineers consider medicine one of the last frontiers to be thoroughly swept up in this revolution.

Medicine by nature is a conservative discipline.  It deals with people's lives.  In the US there's the added problem of the legal profession and malpractice insurance companies breathing down a physician's neck.

I believe that when the medical industry finally fully leverages the capabilities of the computer and communications revolution, medical costs will be lowered and people will spend little or no time in hospitals.  Physicians will have the capability of tuning the dosages of medication in real time.  Sophisticated computer systems that make use of supercomputer models will be able to determine the medical status of a patient in real time or near real time.  These systems will be able to determine if a patient is showing signs of a pending medical crisis and requires intervention before the crisis appears.
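As a toy illustration of the kind of automated surveillance I have in mind, the sketch below flags a reading that drifts sharply away from a patient's recent baseline.  This is a deliberately simplistic, hypothetical detector with made-up thresholds, not a clinical algorithm.

```python
# Toy sketch (not a clinical algorithm): flag a possible pending crisis when
# a vital sign deviates from its recent trailing average by too much.

def warns(readings, window=5, tolerance=0.15):
    """Return True if the latest reading deviates from the trailing
    moving average of the previous `window` readings by more than
    `tolerance` (as a fraction of the baseline)."""
    if len(readings) <= window:
        return False              # not enough history to form a baseline
    baseline = sum(readings[-window - 1:-1]) / window
    return abs(readings[-1] - baseline) / baseline > tolerance

# A stable heart-rate series, then the same series with a sudden jump:
stable = [72, 74, 71, 73, 72, 74]
print(warns(stable))            # stable series: no alert
print(warns(stable + [95]))     # ~30% jump over baseline: alert
```

A real system would of course use validated physiological models rather than a single moving-average band, but the principle of continuous, automatic comparison against a patient-specific baseline is the same.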

All this can be available to the citizenry at a cost that would surprise you.  This is the ounce of prevention on a grand scale.

I shall continue to discuss economical medicine in future articles and how leveraging the computational and communications revolution will contribute to providing better medical care at a lower cost.

Tuesday, December 1, 2009

Biotronik TRUST Studies: Reprinted Abstracts and Commentary

What follows are published abstracts of the Biotronik studies that provided evidence that Home Monitoring can substitute for quarterly check-ups for ICD patients: that is, that care of ICD patients can be just as effective with one in-clinic check-up per year instead of the normal check-up every three months.  This was supported primarily by the 2008 study.

The 2009 study is a logical follow-up to the 2008 study.  It provided evidence that the Biotronik remote monitoring (Home Monitoring) system can provide early-warning notifications of significant cardiac events faster and more effectively than quarterly in-clinic visits.  This study has wider implications.  It provides evidence that remote monitoring can provide the kind of care that at one time could only be provided in hospitals.  Furthermore, it demonstrates the kind of capability necessary to provide the kind of early warning that can keep specific, targeted populations out of the hospital, thus providing more economical and more desirable health care.

These studies are reprinted with permission from Biotronik.  (I have no affiliation with Biotronik.) 

2008 Study

Evaluation of Efficacy and Safety of Remote Monitoring for ICD Follow-Up:

The TRUST Trial

Authors: Niraj Varma, Cleveland Clinic, Cleveland, OH; Andrew Epstein, University of Alabama Birmingham Medical Center, Birmingham, AL; Robert Schweikert, Cleveland Clinic, Cleveland, OH, and Akron Medical Center, Akron, OH; Charles Love, Davis Heart and Lung Research Institute, Columbus, OH; Jay Shah, Carolina Cardiology Associates, Rock Hill, SC; Anand Irimpen, Tulane University Medical Center, New Orleans, LA

Background: Remote monitoring (RM) of ICDs may provide daily, automatic device and patient status data and cardiac event notifications. TRUST tested the hypothesis that RM was safe and effective for ICD follow-up for 1 year in a prospective, randomized controlled clinical trial.

Methods: 1282 patients were randomized 2:1 to RM or to conventional (RM disabled) groups. Follow up checks occurred at 3, 6, 9, 12 and 15 months post-implant. In the RM arm, RM was used before office visits (OVs) at 3 and 15 months. At 6, 9 and 12 months, RM only was used but followed by OVs if necessary. Conventional patients were evaluated with OVs only. Follow up was “actionable” if system reprogramming/revision or change in anti-arrhythmic therapy occurred. Scheduled and unscheduled OVs (including responses to event notifications in RM) were quantified for each individual patient per year (pt yr) of follow up. Incidence of death, strokes and surgical interventions (morbidity) was tracked in both groups. 

Results: RM and conventional patients were similar in age (63.3 ± 12.9 vs 64.1 ± 12.0 yrs, p = 0.30), gender (71.9% vs 72.4% male, p = 0.89), pathology (LVEF 29.1 ± 10.8% vs 28.6 ± 9.8%, p = 0.47; coronary artery disease 64.5% vs 71.4%, p = 0.02), medications (Beta blockers 79.5% vs 75.9%, ACE inhibitors 42.4% vs 46.8%, ARBs 7.8% vs 9.9%, p = NS), indication (primary prevention 72.3% vs 74.2%, p = 0.50), and dual chamber implants (57.9% vs 57.0%, p = 0.76). RM reduced scheduled OVs by 54% and total OVs by 42% without affecting morbidity. Event notifications were managed using RM alone in 92% of cases. Of the remainder resulting in unscheduled OVs, 52.2% were actionable. RM improved adherence to follow-up. 

Conclusions: TRUST demonstrated that remote monitoring is safe, decreases the need for in-office visits, provides early detection of significant problems, and improves ICD surveillance without increasing unscheduled office visits. In conclusion, remote monitoring is a safe alternative to conventional care.

2009 Study


Authors: Niraj Varma, MD, FRCP, Andrew Epstein, MD, Anand Irimpen, MD, Robert Schweikert, MD, Jay Shah, MD, Lori Gibson, DVM and Charles Love, MD. Cleveland Clinic, Cleveland, OH, University of Alabama Birmingham Medical Center, Birmingham, AL, Tulane University Medical Center, New Orleans, LA, Akron Medical Center, Akron, OH, Carolina Cardiology, Rock Hill, SC, Biotronik, Inc., Lake Oswego, OR, Davis Heart & Lung Research Institute, Columbus, OH

Introduction: ICDs have extensive self-monitoring capability with diagnostic data available at interrogation. Remote Monitoring (RM) may facilitate data access but this has not been tested. The secondary endpoint of the TRUST trial tested the hypothesis that RM with automatic daily surveillance can provide rapid notification thereby facilitating prompt physician evaluation.

Methods: 1312 patients were randomized 2:1 to RM or to conventional (C) groups. Follow up checks occurred at 3, 6, 9, 12 and 15 months post-implant. RM was used before office visits (OVs) at 3 and 15 months in RM group. At 6, 9 and 12 months, RM only was used but followed by OVs if necessary. C patients were evaluated with OVs only. Unscheduled checks between these time points were tracked. The hypothesis was tested by determining time elapsed from first event occurrence in each patient to physician evaluation.

Results: RM and C patients were similar (age 63 ±13 vs 64 ±12 yrs; gender 72 vs 73% male, NYHA class II 56 vs 61%, pathology LVEF 29 ±11 vs 28 ± 10%; CAD 65 vs 72%, amiodarone 14 vs 14%, primary prevention indication 72 vs 74%, and DDD implants 58 vs 57%). Median time to evaluation was < 3 days in RM compared to < 30 days in C (p < 0.001) for all arrhythmic events (figure) including silent episodes, e.g., AF. System (lead/generator) problems were infrequent (20 events in RM + C).

Conclusions: Remote monitoring with automatic daily surveillance provides rapid detection and notification of both symptomatic and asymptomatic arrhythmic events, enabling early physician evaluation.


2008 Study

Of significant interest is the morbidity rate.  The remote monitoring group showed a 0.9% higher death rate than the conventional group; this result was nonsignificant.  The nonsignificant difference appears to be the expected outcome.  Demonstrating a negative (no difference) is always a concern in research because of the logical problem of demonstrating that something did not happen or that there is no difference between the groups.

I have an additional concern with respect to the unbalanced design.  Unbalanced designs have lower statistical power, that is, a lower ability to reject the null hypothesis, than balanced designs, and that would be of concern in a study where the expected outcome is no difference.  However, the sample is so large that it should offset the reduction in statistical power created by the unbalanced design.  Since I do not have the raw data, I cannot be sure.  Nevertheless, this seems reasonable. 
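The allocation point can be illustrated with a quick calculation: for a fixed total sample size, the standard error of a difference in proportions is smallest when the two groups are equal, so a 2:1 split gives up a little precision.  The event proportion below is purely illustrative, not a figure from the trial.

```python
import math

def se_diff(p, n1, n2):
    """Standard error of the difference between two sample proportions,
    assuming a common true proportion p (normal approximation)."""
    return math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))

total, p = 1282, 0.05        # TRUST enrolled 1282 patients; p is illustrative

balanced   = se_diff(p, total // 2, total - total // 2)   # 641 / 641 split
unbalanced = se_diff(p, 2 * total // 3, total // 3)       # roughly 2:1 split

print(round(balanced, 5), round(unbalanced, 5))
# The unbalanced split yields a larger standard error, hence lower power,
# though with n this large the loss is small.
```

This matches the intuition above: the 2:1 randomization costs some power, but at TRUST's sample size the penalty is modest.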

The remote monitoring group did have a slightly higher rate of unscheduled appointments: 0.6 per year in the RM group versus 0.5 in the conventional group, and the actionable percentage was 0.7% higher in the RM group.  The differences could be considered marginally significant (p = 0.104).  If I understand the circumstances correctly, it seems reasonable that remotely monitored patients would have a higher rate of unscheduled appointments.  Remote monitoring should be capable of earlier detection of arrhythmic events; a detected cardiac event would trigger the clinic to request that the patient come in as soon as possible, which would be registered as an unscheduled appointment.  One might expect remotely monitored patients to have appointments that are more demand- or situation-based than regular, scheduled appointments.

In spite of the difficulties of this design, the conclusions of this study seem reasonable in that the remotely monitored patients who received in-clinic check-ups once per year had similar outcomes to those who received conventional care with four in-clinic visits per year.

A point of interest: no comparisons were made between Biotronik's system and the remote monitoring systems provided by other companies such as Medtronic, St. Jude Medical or Boston Scientific.  I understand the difficulties and roadblocks in attempting to assess whether the other systems would be just as effective.  Biotronik effectively side-stepped the issue by comparing its home monitoring system against conventional care, thus avoiding comparisons with other remote monitoring systems.  Biotronik focused on effectiveness against conventional care, and in this case it was successful.

2009 Study

As a study to show the effectiveness of remote monitoring, I believe this study is more effective.  First, it has a better design in that the expected outcome is to reject the null hypothesis, that is, to find a significant difference.  Second, there is a clear case made by the findings that remote monitoring leads to earlier discovery of an adverse event.  One truism in medicine, particularly when it comes to cardiac events, is that the earlier the discovery, the better the outcome.  In addition, ICD patients have been identified as a vulnerable population, and rapid reports of adverse events within this population are particularly welcome.

In theory, over time patients remotely monitored should show better outcomes than those who are not.  The data in these two studies does not show that.  However, data from other studies are starting to demonstrate that remotely monitored patients are less likely to be admitted to the hospital.  This is a new area of technology and more research is required.  However, the trends are favorable for remote monitoring.

In this study, the Biotronik remote monitoring system reported arrhythmic events.  The data reported was not early warning or predictive.  The capability to collect predictive data would increase the value of remote monitoring.  Predictive data would allow the clinic (or computer system) following the patient to intervene before the adverse event occurs.  In this study, this was not the case.

Friday, November 20, 2009

Remote Monitoring Equals Healthier Patients

I know I promised an article discussing the Biotronik studies.  However, I just came across a brief article that I wanted to share.  It describes a study showing that the introduction of remote monitoring can substantially reduce hospital admissions.  Here's the link:

This is the kind of article that provides additional supporting evidence of the benefits of remote monitoring: to patients and to the bottom line of health care.  Furthermore, as I remarked in, the people I've known have wanted to stay out of hospitals.  So this should be considered a win all the way around.

Thursday, November 19, 2009

Body Area Networks

This is one of the best articles I have seen recently that discusses emerging technologies and standards for Body Area Networks. It's published by ZDNet.  Here's the link: 7 things you should know about Body Area Networks (BANs).

In my next article I'll discuss the two TRUST articles on the Biotronik Home Monitoring system.


Tuesday, November 17, 2009

The Virtual Doctor Visit: Washington Post

I grew up around elderly people.  My parents were middle-aged when I was born, my grandparents were elderly, and many of my parents' friends were elderly.  I cannot think of one person who said that they liked being in a hospital.  A continual fear of my parents, grandparents and my parents' elderly friends was the fear of wasting away in either a hospital or nursing home.  Death was a better alternative.  Not that they wanted to die, but that they did not want to die in the confines of a hospital or nursing home.

This is an article published today (Tuesday, 17 November 2009) in the Washington Post that discusses remote monitoring as an alternative to a hospital admission.  There's a trial underway to determine if remote monitoring can provide the kind of information that physicians require to keep people from being admitted to the hospital.  It's care in the home.  Here's the link: The Virtual Doctor Visit.

Here's an update on the Digital Plaster trial:

Monday, November 16, 2009

Maintaining Communication Security

Having a secure channel is particularly important for remote monitoring and remote programming.  Here's an article that was recently published regarding a company that has taken an interesting approach to the problem.  Here's the link: Boosting the security of implantable devices.  

I am the inventor of a data communications security technology and a founder of a security company.  (I am currently a silent partner.)  So, I have an interest in security technology and systems.  In later articles, I'll cover some of the issues regarding maintaining communications security.

Friday, November 13, 2009

Biotronik Home Monitoring: Update

Biotronik Home Monitoring recently received the industry's first European CE Mark.  Here is the link to one of the publications that announced this: Biotronik Home Monitoring Receive Industry Approval.  The approval appears to be founded on the studies conducted by Varma that are referenced in the article.  I hope to have more information on this subject in the near future.

Thursday, November 12, 2009

Near Future: Remote Monitoring and Programming

This article will focus on a system of remote medical monitoring and remote programming as shown in the figure below.

I've discussed elements of this design in earlier posts, so I'll not go into detail about things I have already covered.  This is particularly true of the communications model, which involved a mobile server and a central server.  The model I show in the figure is more "ready" for commercial deployment in that there are multiple, redundant Central Servers in multiple locations.  This is in keeping with the telecommunications philosophy of achieving near-perfect connectivity through the backbone systems.

Another addition is WiMax (the 802.16 standard; for more information: WiMax Wikipedia), which is now being commercially deployed.  This adds another viable data channel over which to send data.  As I mentioned before, the system that we developed was able to move traffic over one or all channels simultaneously, and traffic can be rerouted based on channel acquisition or loss. 
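Here is a rough sketch of the channel-rerouting behavior described above.  The channel names and the preference order are my assumptions for illustration, not the actual system design.

```python
# Illustrative sketch of multi-channel routing: traffic prefers the best
# available channel and is rerouted when channels drop or reappear.

class ChannelRouter:
    PRIORITY = ["wimax", "cellular", "landline"]   # assumed preference order

    def __init__(self):
        self.up = set()        # channels currently acquired

    def link_up(self, channel):
        self.up.add(channel)

    def link_down(self, channel):
        self.up.discard(channel)

    def route(self):
        """Pick the highest-priority channel currently available."""
        for ch in self.PRIORITY:
            if ch in self.up:
                return ch
        return None            # no connectivity; queue data locally

router = ChannelRouter()
router.link_up("cellular")
router.link_up("wimax")
print(router.route())          # "wimax" preferred while available
router.link_down("wimax")
print(router.route())          # traffic reroutes to "cellular"
```

A production system would move in-flight traffic across channels transparently; the point here is only that routing follows channel acquisition and loss.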

The important elements of this design for this discussion are at the ends.  Let's begin at the bottom of the diagram.  A patient could be implanted with multiple devices from multiple manufacturers.  In the diagram I show an insulin pump from Medtronic, an ICD from St. Jude Medical and a pacemaker from Boston Scientific.  We could include devices from Biotronik as well.  The mobile server in the diagram can communicate with all the devices and address and communicate with them individually.  (We have already proven this technology.)  We would assume that the data traffic from the devices would be bidirectional and that delivery is guaranteed and secure across the connection, to and from the analysis and device servers.

Without going into substantial detail, each device has a specific and separate device managing process running on the mobile server.  Using a "plug-in" architecture, each process communicates with the multi-layered, distributed system that moves data across the network.  Each device has a continuous, virtual connection with its counterpart Analysis and Device Management server to support both remote monitoring and remote programming.
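A minimal sketch of the plug-in idea: each device gets its own managing object registered with the mobile server, which dispatches traffic to the right plug-in by device id.  All identifiers here are illustrative, not real device addresses.

```python
# Sketch of a "plug-in" architecture: one managing process (here, an object)
# per implanted device, registered with the mobile server by device id.

class DevicePlugin:
    def __init__(self, vendor, device_id):
        self.vendor = vendor
        self.device_id = device_id
        self.inbox = []

    def handle(self, message):
        self.inbox.append(message)   # hand off to vendor-specific logic

class MobileServer:
    def __init__(self):
        self.plugins = {}

    def register(self, plugin):
        self.plugins[plugin.device_id] = plugin

    def dispatch(self, device_id, message):
        """Route a message to the managing plug-in for one device."""
        self.plugins[device_id].handle(message)

server = MobileServer()
server.register(DevicePlugin("Medtronic", "pump-01"))
server.register(DevicePlugin("St. Jude Medical", "icd-02"))
server.register(DevicePlugin("Boston Scientific", "pacer-03"))

server.dispatch("icd-02", {"type": "interrogation_request"})
print(server.plugins["icd-02"].inbox)
```

In the real system each plug-in would maintain its continuous, virtual connection to its counterpart Analysis and Device Management server; the registry and dispatch pattern is the part sketched here.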

The digital plaster (or plastic strips) would generate various types of monitoring data as shown in the diagram.  A single, multi-threaded process could manage any number of strips.  

It would be conceivable for the device managing processes to subscribe to any of the digital plaster processes and send the collected data from the patients to any or all of the Analysis and Device Management Servers.  The digital plaster strips could collect a variety of types of data from any number of locations.  This would reduce the need to build the monitoring capabilities into the devices themselves and conceivably provide the kind of data the devices could never provide.  
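The subscription idea can be sketched as a simple publish/subscribe bus: plaster sensor processes publish readings, and any device-managing process subscribes to just the streams it needs.  All stream names here are hypothetical.

```python
# Sketch of the subscription idea: digital-plaster sensor streams publish
# readings; device-managing processes subscribe to the streams they need.

from collections import defaultdict

class SensorBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, data_type, callback):
        self.subscribers[data_type].append(callback)

    def publish(self, data_type, reading):
        for cb in self.subscribers[data_type]:
            cb(reading)

bus = SensorBus()
received = []

# An ICD-managing process subscribes only to the streams relevant to it:
bus.subscribe("heart_rate", received.append)
bus.subscribe("ecg", received.append)

bus.publish("heart_rate", 71)
bus.publish("temperature", 36.8)   # no subscriber; this stream is ignored
bus.publish("ecg", [0.1, 0.4, 1.2])

print(received)                    # [71, [0.1, 0.4, 1.2]]
```

The decoupling is the point: new plaster strips can be added as publishers without touching the device-managing processes, and vice versa.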

This system is primarily software-defined and is highly flexible and extensible. Furthermore, it provides the flexibility to incorporate a wide variety of current and future monitoring systems.  I'll continue to update this model as I find more products and technologies to include.

Sunday, November 8, 2009

Remote Monitoring: Predictability

One of the most controversial subjects in measurement and analysis is the concept of predictability.  Prediction does not imply causality or a causal relationship.  It is about an earlier event or events indicating the likelihood of another event occurring.  For example, I've run simulation studies of rare events.  If any of my readers have done this, you'll have noticed that rare events tend to cluster around each other.  This means that if one rare event has occurred, it's likely that the same event will occur again in a relatively short time.  
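The clustering effect can be reproduced with a few lines of simulation.  Even fully independent rare events look clustered, because the gaps between them are geometrically distributed and short gaps are the most common.  The probability and run length below are arbitrary choices for illustration.

```python
import random

# Simulate independent rare events (probability p per time step) and look at
# the gaps between consecutive events.  Although each step is independent,
# most gaps come out *shorter* than the mean gap of 1/p, which reads as
# clustering to the eye.

random.seed(42)
p, n = 0.01, 100_000
event_times = [t for t in range(n) if random.random() < p]

gaps = [b - a for a, b in zip(event_times, event_times[1:])]
short = sum(1 for g in gaps if g < 1 / p)   # gaps shorter than the mean gap

print(len(event_times), "events")
print(f"{short / len(gaps):.0%} of gaps are shorter than the mean gap")
```

For a geometric gap distribution, roughly 1 - 1/e (about 63%) of gaps fall below the mean, which is why runs of near-together events appear even with no underlying mechanism linking them.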

Interestingly, the clustering does not seem to be an artifact of the simulation system.  There are some real-world examples.  Consider the paths of hurricanes.  At any one time, it is rare that a hurricane will make landfall at a particular location.  However, once a hurricane has hit a particular location, it appears that the likelihood of the next hurricane hitting in that same general area goes up.  I can think of a couple of examples in recent history.  In 1996, hurricanes made landfall twice around the area of Wilmington, NC, and a third hurricane passed by.  In 2005, New Orleans was hit solidly twice.  If you look at those two hurricane seasons, 1996 and 2005, you'll note that they show quite different patterns.  The rare event paradigm suggests that when the patterns for creating rare conditions are established, they will tend to linger. 

In medicine the objective is to find an event, or conditions, that precede the event of concern before that event occurs.  For example, an event of concern would be a heart attack.  It is true that once one has had a heart attack, another one could soon follow; the conditions are right for a follow-on event.  However, the objective is to prevent a heart attack, not to wait for a heart attack to occur in order to deal with the next one that is likely to soon follow.  Physicians employ a variety of means to attempt to detect conditions that may indicate an increased likelihood of a heart attack.  For example, cholesterol levels that are out of balance might signal an increased likelihood of having a heart attack.  

The problem is that most of the conditional indicators that physicians currently employ are weak indicators of an impending heart attack.  The indicators are suggestive.  Let me illustrate using a slot machine.  Let's assume that hitting the jackpot is equivalent to a heart attack.  Each pull of the lever represents another passing day.  On its own, with its initial settings, the slot machine has some probability of hitting a jackpot with each pull of the lever.  However, the settings on the slot machine can be biased to make it more likely to hit a jackpot.  This is what doctors search for: the elevated conditions that make a heart attack more likely.  Making a jackpot more likely does not mean that you're ever going to hit one.  It just increases the likelihood that you will.  
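The slot-machine analogy can be put in numbers: for a constant per-day probability, the expected wait until the event is simply the reciprocal of that probability, so "biasing" the odds shortens the average wait without ever guaranteeing the event.  The daily risks below are invented for illustration.

```python
# The slot-machine analogy in numbers.  With a constant probability p of the
# event on each "pull" (day), days-until-event is geometric with mean 1/p:
# raising p shortens the expected wait but never makes the event certain.

def expected_days_to_event(p_per_day):
    """Mean of a geometric distribution: average days until first event."""
    return 1 / p_per_day

baseline = 0.0001     # assumed baseline daily risk
elevated = 0.0005     # assumed daily risk with "biasing" conditions present

print(expected_days_to_event(baseline))   # about 10,000 days on average
print(expected_days_to_event(elevated))   # about 2,000 days on average
```

The fivefold "bias" cuts the expected wait fivefold, yet on any given day the event remains unlikely, which is exactly why such indicators are suggestive rather than predictive.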

To compound the problem, biasing conditions that appear to increase the likelihood of events such as heart attacks are often difficult to clearly assess.  One problem is that apparent biasing indicators or biasing conditions generally don't have a clear causal relationship to the event.  They are indicators; they have a correlative relationship (that is not always strong), not a causal relationship.  There are other problems as well.  For one, extending conclusions to an individual from data collected from a group is generally considered suspect.  Yet that is what's going on when assessments are performed on individuals: individuals are compared to norms based on data collected from large groups.  Over time, and with enough data, norms may come to be considered predictors.  Search the literature and you'll note that many measurements that once were considered predictive no longer are.
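A small example of comparing an individual to a group norm: a z-score locates one patient's measurement within a population distribution.  All the numbers are invented for illustration.

```python
import statistics

# Toy illustration of norm-based assessment: an individual's measurement is
# located within a group distribution via a z-score.  Invented numbers.

population = [180, 195, 210, 205, 190, 220, 185, 200, 215, 198]
mean = statistics.mean(population)
sd = statistics.stdev(population)

patient = 240
z = (patient - mean) / sd   # standard deviations above the group mean

print(round(z, 2))
```

The logical leap the paragraph describes is visible here: the distribution comes from the group, yet the conclusion ("unusually high") is applied to one individual.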

The gold standard of prediction is the discovery of a predecessor event or events - something that precedes the watched-for event.  In Southern California, everyone is waiting for the great earthquake, and scientists have been attempting to discover a predecessor event for it.  The same goes for detecting a heart attack or other important medical events that threaten one's health.  Two clear problems stand in the way of discovering a clear predecessor event.  The first is finding an event that seems to precede the event of interest.  This is not easy; a review of the literature will confirm that.  The second is that once you've found what appears to be a predecessor event, you must establish its relationship to the target event.  That is often a very long process, and even with effectively predictive predecessor events, the relationship is not always one to one: several predecessor events may precede the event of interest, or the predecessor event may not always appear before the event of interest.

This ends my discussion of predictability.  Next time ... I'm going to speculate on what may be possible in the near term and how the benefits of remote monitoring and remote programming can be made available relatively inexpensively to a large number of people.

Article update notice

I have updated my article on Digital Plaster.  I have found an image of digital plaster that I have included, plus a link to one of the early news releases from the Imperial College, London, UK.  I shall include Digital Plaster in my next article.

Remote Monitoring: Update to Sensitivity and Accuracy

Before I dive into the subject of predictability (following article), I have an update on one of my previous articles: Remote Monitoring: Sensitivity and Accuracy.  It comes from a discussion I had with a colleague regarding what appeared to be counter-intuitive results.  The issue was the data sampling rate over a fixed period of time: as the sampling rate increased, accuracy decreased.  Thus, with seemingly more data, accuracy went down.

Going back to the Signal Detection paradigm: the paradigm suggests that, as a rule, increasing the number of data points will reduce false positives (alpha), and reducing false positives was a major objective of this research.  Frankly, for a time I was flummoxed.  Then I realized that I was looking at the problem incorrectly: the problem lies with the resolution, or granularity, of the measurement.

The Signal Detection paradigm has as a fundamental assumption the concept of a defined event or event window - and detecting whether or not a signal is present within that event window.  The increased sampling rate compounded error, particularly false positive errors.  In effect, the system would take two samples within the conditions that set off a false positive, thus producing more than one false positive within an event window where only one false positive should have been recorded.

How to overcome the problem of oversampling, of setting the wrong size event window?  Here are some things that come to mind:
  • First, recognizing that there's an event-window problem may be the most difficult part.  This particular situation suggested an event-window problem because the results were counter to expectations.  Having a primarily theoretical perspective, I am not the best one to address this issue. 
  • Finding event windows may involve a tuning or "dialing-in" process.  However it is done, it may take many samples at various sampling resolutions to determine the best or acceptable level of resolution.
  • Consider adding a waiting period once a signal has been detected.  The hope is that the waiting period will reduce the chances of making a false positive error.
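As a sketch of the third suggestion, a short waiting (refractory) period after each detection collapses the duplicate detections that oversampling produces within a single event window. The signal values, threshold and window sizes below are invented for illustration:

```python
def detect(samples, threshold, refractory):
    """Return indices where the signal crosses threshold, ignoring any
    further crossings within `refractory` samples of the last detection."""
    detections = []
    last = None
    for i, value in enumerate(samples):
        if value > threshold:
            if last is None or i - last >= refractory:
                detections.append(i)
                last = i
    return detections

# One noisy burst sampled at a high rate: naive detection reports the
# same event three times; a refractory period reports it once.
samples = [0.1, 0.2, 1.3, 1.4, 1.2, 0.3, 0.1, 0.2, 1.5, 0.2]
print(detect(samples, threshold=1.0, refractory=1))   # [2, 3, 4, 8]
print(detect(samples, threshold=1.0, refractory=5))   # [2, 8]
```

With the waiting period in place, the two genuine events at indices 2 and 8 are each counted once, instead of the first event being recorded three times.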
On a personal note: I find it amusing that before this, I had never encountered a granularity-related issue.  In theory I have understood it, but I had never encountered it in my own research, in part because the research I have performed has always had clear event boundaries.  Nevertheless, within days of writing about Sensitivity and Accuracy and the granularity issue in this blog, I encountered a granularity problem.

Tuesday, November 3, 2009

Sensor Technology: Digital Plaster and Stethoscope

Digital Plaster

Toumaz Technology has announced clinical trials of what they are calling "digital plaster," which should enable caregivers to remotely monitor patients.  In the initial trial it would allow caregivers to remotely monitor patients while they are in the hospital.  However, a patient could conceivably carry a mobile monitoring system like the one that I discussed in my article: Communication Model for Medical Devices.  

Here is a link to the article on Digital Plaster:

Update:  Here's an image of digital plaster from a UK website, to give you a sense of the size and means of application.  It's a sensor placed into a standard plastic or cloth strip - simple to apply and disposable.  

For more information, here's the link: Imperial College, London, UK.  This is a 2007 article and a good reference point for investigating the technology. 

Digital Stethoscope

Another development was the announcement at TEDMED of a digital stethoscope.  Here's the link to the article:  The article discusses this and other new wireless medical devices that will enable patients to be remotely monitored from virtually anywhere, thus providing the capability to keep people out of hospitals, or keep them there for shorter periods of time.  Furthermore, these technologies have the capability of improving care while lowering costs.  Again, I think it would be instructive to read my articles on mobile, wireless data communications:  1) Communication Model for Medical Devices and 2) New Communications Model for Medical Devices.

Sunday, November 1, 2009

Remote Monitoring: Sensitivity and Accuracy ... using wine tasting as a model

This article focuses on measurement accuracy, sensitivity and informativeness.  Sometime later I shall follow with an article that focuses on predictability.  

In this article I discuss measurement accuracy, sensitivity and informativeness in the abstract, using wine tasting as an example.  In later articles, when I drill down into specific measurements provided by remote monitoring systems, I shall refer back to concept-foundation articles such as this one.

For remote monitoring to be a valuable tool, the measurements must be informative.  That is, they must provide something of value to the monitoring process - whether that process is an informed and well-trained person such as a physician, or a software process.  However, there are conditions that must first be met before any measurement can be considered informative.

For any measurement to be informative, it must be accurate: it must correctly measure whatever it was intended to measure.  For example, if the measurement system is designed to determine the existence of a particular event, then it should register that the event occurred and the number of times it occurred.  Furthermore, it should reject, or not respond, when conditions dictate that the event did not occur - that is, it should not report a false positive.  This is something I covered in detail in my article on Signal Detection.  Measurement extends beyond mere detection to measurement tied to a particular scale, e.g., the constituents in a milliliter of blood.

A constituent of accuracy is granularity: how fine the measurement is, and whether it is fine enough to provide meaningful information.  Measurement granularity can often be a significant topic of discussion, particularly when defining similarities and differences.  For example, world-class times in swimming are recorded to the hundredth of a second.  There have been instances when the computer sensed that two swimmers touched the end simultaneously and that their times were identical.  (I can think of a particular race in the last Olympics that involved Michael Phelps and the butterfly.)  At the resolution of the computer touch-timing system (and I believe it's down to a thousandth of a second), the system indicated that both touched simultaneously and that they had identical times.  However, is that really true?  If we take the resolution down to a nanosecond, one-billionth of a second, did they touch simultaneously?  
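The timing question can be put in code: whether two measurements count as "identical" depends entirely on the resolution at which they are compared. The finish times below are invented for illustration:

```python
def tied(time_a, time_b, resolution):
    """Two times count as identical only down to the resolution of the
    measuring system (resolution in seconds)."""
    return round(time_a / resolution) == round(time_b / resolution)

# Hypothetical finish times, invented for illustration.
a, b = 50.58231, 50.58244
print(tied(a, b, 0.01))     # at hundredths of a second: a tie
print(tied(a, b, 0.001))    # at thousandths: still a tie
print(tied(a, b, 0.0001))   # at ten-thousandths: no longer a tie
```

The same pair of swimmers is "simultaneous" or not depending solely on the granularity chosen, which is the point of the Olympic example above.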

At the other end, if measurements are too granular, do they lose their meaningfulness?  This is particularly true when defining what is similar.  It can be argued that with enough granularity, every measurement will differ from all other measurements on that dimension.  How then do we assess similarities?  Assessing similarities (and differences) is vital to diagnosis and treatment.

We often make compromises on issues of granularity and similarity by categorizing.  Often, categorization and assessments of similarity are context-specific.  This is something we do without thinking; we often assess and reassess relative distances.  For example, Los Angeles and San Diego are 121 miles from each other.  (I used Google to find this distance.)  To people living in either city, 121 miles is a long distance.  However, to someone in London, England, these two cities would seem to be nearly in the same metropolitan area: viewed from far enough away, they appear to be within the same geographic area. 

Sensitivity is a topic unto itself.  Since I discussed it at some length in the Signal Detection article, I shall keep this discussion relatively short.  Previously, I discussed a single detector and its ability to sense and reject.  I now want to add the dimension of multiple detectors and the capability to sense based on multiple inputs.  I am not discussing multiple trials to test a single detector, but multiple measures on a single trial.  Multiple measurements on different dimensions, when combined, can provide greater sensitivity even if each individual measurement system is less accurate and sensitive than a single measurement system.  I'll discuss this more in depth in a later article.
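As a rough sketch of how combining detectors can pay off, consider three independent detectors, each correct only 80% of the time, combined by majority vote. The 80% figure is arbitrary, chosen purely for illustration:

```python
import random

def make_detector(p_correct, rng):
    """A single noisy detector: reports the true state with probability p_correct."""
    def detect(signal_present):
        return signal_present if rng.random() < p_correct else not signal_present
    return detect

def majority(detectors, signal_present):
    """Combine independent detectors by majority vote."""
    votes = sum(d(signal_present) for d in detectors)
    return votes > len(detectors) // 2

rng = random.Random(42)
single = make_detector(0.8, rng)
trio = [make_detector(0.8, rng) for _ in range(3)]

trials = 10_000
single_correct = sum(single(True) for _ in range(trials)) / trials
combined_correct = sum(majority(trio, True) for _ in range(trials)) / trials
print(f"single detector: {single_correct:.3f}, majority of three: {combined_correct:.3f}")
```

With independent errors, the majority of three 80%-accurate detectors is correct roughly 90% of the time, better than any of its members; this is the benefit of multiple measures on a single trial.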

Informativeness ... this has to do with whether the output of the measurement process - its accuracy (granularity) and sensitivity - provides one with anything of value.  Determining that value depends on what you need the measurement to do for you.  I think my example provides a reasonable and accessible explanation.

Wine Tasting - Evaluating Wine

Over the years, people interested in wine have settled on a 1-100 scale - although I do not know of an instance where I have seen anything less than an 80 rating.  (I am not a wine expert by any stretch of the imagination; I know enough to discuss it, that's all.  If you're interested, here's an explanation, though note that they will want to sell you bottles of wine and some companies may block access.  Here's the link: )  Independent or "other" wine raters use similar rating systems.  Wine stores all over the US often have their own wine rater who "uses" one of these scales.  In theory, you'll note that the scales are reasonably similar.  In practice, they can be quite different: two 90 ratings from different wine raters don't always mean the same thing.

So, what is a buyer to do?  Let's look at wine rating in a mechanistic way.  Each wine rater is a measuring machine who is sensitive to the various constituents of a wine and to how those constituents provide an experience.  Each rating machine provides us with a single number and often a brief description of the tasting experience.  But for most people buying wine, it's the number that's most important - and that can often lead to the greatest disappointment.  When we're disappointed, the measurement has failed us: it lacks informativeness.

How to remedy the disappointment of expectations and, oftentimes, overpayment?  I can think of four ways:
  1. Taste the wine yourself before you buy it.  The wine should satisfy you, and you can determine whether it's worth the price.  However, I've met many who are not satisfied with this option for a variety of reasons, ranging from not trusting their own taste or "wine knowledge" to knowing that they are not in a position to taste the wide variety of wines available to professional wine tasters, and thus being concerned about "missing out."  Remote monitoring presents a similar situation.  A remotely monitored patient is not in the presence of the person doing the monitoring, so the experience of seeing the patient along with the measurement values is missing.  However, remote monitoring can provide a great deal of information about many patients without the need to see each individual.  The catch is that the person doing the monitoring needs to trust the measurements from remote monitoring.
  2. Find a wine rater who has tastes similar to yours.  This might take some time or you might get lucky and find someone who likes wine the way you like it.  Again, this all boils down to trust.
  3. Ask an expert at the wine store.  The hope is that the person at the store will provide you with more information and ask about your own tastes and what you're looking for.  Although this is not experiential information, you are provided with more information on more dimensions, with the ability to re-sample on the same or different dimensions (i. e., ask a question and receive an answer).  In this sense, you have an interactive measurement system.  (At this juncture, I have by implication added remote programming to the mix.  Remote programming involves adjusting, tuning or testing additional remotely monitored dimensions.  In this sense, the process of remote monitoring can be dynamic and inquiry-driven.  This is a topic for later discussion.)
  4. Consolidate the ratings of multiple wine raters.  Often, several raters have rated the same wine.  This can get fairly complicated: in most cases not all raters have rated the same wine, and you'll probably get a different mix of raters for each wine.  This too may involve some level of tuning based on the "hits" and "misses." 
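A minimal sketch of the fourth option might look like the following, where each rater's known bias is removed before averaging whatever ratings exist for each wine. All names, ratings and bias values are invented:

```python
def consolidate(ratings_by_wine, rater_bias):
    """Average the available ratings for each wine after removing each
    rater's average bias (a rater who rates everything high is re-centred)."""
    scores = {}
    for wine, ratings in ratings_by_wine.items():
        adjusted = [score - rater_bias[rater] for rater, score in ratings.items()]
        scores[wine] = sum(adjusted) / len(adjusted)
    return scores

# Invented ratings: note that not every rater has rated every wine.
ratings = {
    "wine A": {"rater1": 92, "rater2": 88},
    "wine B": {"rater1": 90, "rater3": 91},
    "wine C": {"rater2": 85, "rater3": 95},
}
bias = {"rater1": 2.0, "rater2": -1.0, "rater3": 3.0}  # offsets from a common baseline
print(consolidate(ratings, bias))
```

Estimating each rater's bias is itself a tuning problem, which is where the "hits" and "misses" mentioned above come in.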
This ends the discussion of measurement.  Measurement is the foundation of remote monitoring: what it is measuring, the accuracy and sensitivity of that measurement, and whether that measurement is informative are key to its value.  We've also seen a place for remote monitoring as a means of getting at interesting measurements - changing measurement from a passive to an active, inquiry-driven process.

Next time I discuss a recent development with respect to physiological measuring systems.  Here's a link to an article that I believe many will find interesting. 

Wednesday, October 28, 2009

Biotronik Home Monitoring Claim

I'm posting this article before my discussion on measurement and sensing because it has relevance to my immediately preceding posting.  

On Tuesday, 27 October 2009, Biotronik issued a press release announcing their Evia Pacemaker.  The press release included some additional information regarding Biotronik's Home Monitoring system.  Here's the link to the press release:,1016041.shtml

The relevant quote from the press release is the following:

Now physicians have the choice to call in their patients to the clinic or perform remote follow-ups with complete access to all pertinent patient and device information, including high quality IEGM Online HD®. Importantly, BIOTRONIK Home Monitoring® has also received FDA and CE Mark approval for its early detection monitoring technology which allows clinicians to access their patients’ clinically relevant event data more quickly so they can make immediate therapy decisions to improve patient care. 

The indication is that Biotronik claims their system provides quicker access to relevant data, not that the data (and analysis) yield earlier warnings.  This is consistent with my earlier analysis and seems to be supported by Biotronik's own wording.

I do wonder about Biotronik's long-term objective.  I suspect that Biotronik wants to be one of the big three implantable device manufacturers, not just become one of four.  It would mean that Biotronik would likely target one of the big three to replace and that would likely involve targeting the weaknesses of the company that Biotronik wants to replace.  I'll continue to monitor Biotronik and report what I find.

Next, my discussion on measurement and detection.

Sunday, October 25, 2009

Remote Monitoring: Deep Dive Introduction

I am going to change course over the next few entries to focus on remote monitoring.  This article is the first in a series of articles on Remote Monitoring and what can be gleaned from the data remote monitoring collects.  The Biotronik press releases and some of the claims they have been making have driven me to investigate and speculate on remote monitoring, its capabilities, potential and possible future. 

Two claims that Biotronik has made for its Home Monitoring system intrigue me.  First, Biotronik claims, as a proven capability, earlier detection of critical arrhythmic events than other systems.  Second, they claim that they can report these events earlier than other systems.  

Let's take the second claim first.  Biotronik has created a system with the capability to more quickly report (i.e., transmit) implant data.  By virtue of the mobility of their monitor and its communication system, events can be detected and transmitted more quickly.  So, the second claim appears plausible.

The first claim is more difficult, not only because it is more difficult to prove, but because it is more difficult to define.  I can think of at least two ways the capability could be defined and implemented.  One is to consider the signal-detection paradigm.  I have provided a drawing that defines the basic signal detection paradigm below.


The basic concept of signal detection is extraordinarily simple.  On any given trial, a signal is either present or not, and it is the job of the detector to accurately determine whether the signal is present.  There are two right answers and two wrong answers, as shown in the diagram.  A type 1 error is the detector's indication that a signal is present when it is not.  (The probability of a type 1 error is represented by the Greek letter alpha.)  A type 2 error is incorrectly indicating that a signal is not present when in fact it is.  (The probability of a type 2 error is represented by the Greek letter beta.)
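The four cells of the paradigm, and the alpha and beta estimates that fall out of them, can be tallied directly. The trial outcomes below are invented for illustration:

```python
def signal_detection_counts(trials):
    """Tally the four outcomes of the signal-detection paradigm.
    Each trial is a pair (signal_present, detector_said_present)."""
    counts = {"hit": 0, "miss": 0, "false_alarm": 0, "correct_rejection": 0}
    for present, detected in trials:
        if present and detected:
            counts["hit"] += 1
        elif present and not detected:
            counts["miss"] += 1                # type 2 error (beta)
        elif not present and detected:
            counts["false_alarm"] += 1         # type 1 error (alpha)
        else:
            counts["correct_rejection"] += 1
    return counts

trials = [(True, True), (True, False), (False, True), (False, False), (True, True)]
c = signal_detection_counts(trials)
alpha = c["false_alarm"] / (c["false_alarm"] + c["correct_rejection"])
beta = c["miss"] / (c["miss"] + c["hit"])
print(c, f"alpha={alpha:.2f}", f"beta={beta:.2f}")
```

Alpha is estimated from the signal-absent trials and beta from the signal-present trials, which is why both kinds of trial are needed to characterize a detector.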

The objective of detector improvement is to reduce both type 1 and type 2 errors.  However, adjustments are often made to alpha or beta to make it look like there's an improvement.  For example, if sensitivity is the crucial characteristic, engineers may be willing to accept an increase in type 1 errors to reduce type 2 errors.  (This gets into what's called receiver operating characteristics, or ROC - something for a later blog article.)

I discuss the signal detection paradigm for two reasons.  First, it is an engineering and scientific touchstone that I'll refer to in later articles.  Second, it allows one to assess just what constitutes accurate detection, increased sensitivity, and so on. 

Thus Biotronik's claim of earlier detection could be real, or it could reflect Biotronik's acceptance of more type 1 errors in order to raise sensitivity.  That would lead to earlier detection, but at the expense of an increased likelihood of type 1 errors.  In the next article, I'll explore ways to improve detection capabilities, not by increasing the accuracy of a particular detector, but by increasing the number of different detectors.

Early detection could also be interpreted as prediction.  This is more difficult than simple detection.  It would be the computed likelihood of a particular event based on one or more measurements, and it does not fit into the simple signal detection paradigm.  It often involves finding a pattern and extrapolating.  Or it could involve finding a predecessor indicator: a condition that is a known precondition of the target event.  The specifics of a predictive capability will be discussed in a later article.  

This ends the Introduction.  The next article will discuss detection capabilities in greater detail.

Thursday, October 22, 2009

Update: Future-Market Analysis: Global Patient Monitoring

I'm posting a link to an article that provides some information from the Global Patient Monitoring Marketing study.  Here's the link: Europe Remote Patient Monitoring Market: Strategic Analysis and Opportunity Assessment.  One warning: the article is loaded with embedded ads and links to services they want to sell you.  Go to the article, you'll see what I mean.  However, the article provides some information about the size of, and growth potential for, remote monitoring in Europe.

Wednesday, October 21, 2009

Verizon's Offering at the Connected Health Symposium

An article (@Connected Health: Verizon highlights partners) briefly describes the benefits and cost savings from tele-medicine.  For example, Verizon claims that "IT healthcare solutions and services can help organizations save close to $165 billion annually, according to the carrier. The carrier also cites a report from the Insight Research Corporation that estimates $800 million per year could be saved if more treatment was shifted from physician’s offices to home health visits."

Of course, tele-medicine and its applications bring revenue to Verizon (and other carriers), so the cost-savings figures should be viewed sceptically.  However, in general, tele-medicine solutions nearly always provide cost savings over clinic and hospital visits.  They also provide an additional level of freedom that improves quality of life.

I want to add this link, which sounds a significant concern regarding the supply of and demand for communications bandwidth in the near future.  Here's the link:  The title of the article is "Are we ready for the Exabyte Tsunami?"  (Here's a link explaining an exabyte: )

Tuesday, October 20, 2009

Biotronik Home Monitoring Operational in Europe

I've mentioned Biotronik's Home Monitoring system in an earlier post.  One of the attractive things about the Biotronik version is that their home monitoring has been deemed a replacement for clinic visits.  Here is a quote from the article (link immediately below):

"Designed to avoid regular visits to the clinic by patients wearing company's ICD's, CRT's, and similar devices, the system sends readings from the chest straight to your doc over the cellular phone network."

This is an interesting development because Biotronik has been taking market share from the big three medical device makers.  I think that the Biotronik capability to reduce clinic visits translates into either more revenue or more free time.  Either would be attractive to device-managing physicians, who may suggest that implanting physicians choose Biotronik.  This may be a situation where a robust home monitoring system drives the choice of which brand of device to implant.  I do not have clear evidence, but I think the issue is worth investigating.

Three aspects of the Biotronik home monitoring system seem to differentiate it from others.  First, the monitoring unit is mobile and uses GSM to communicate with the monitoring servers.  The monitoring servers in turn can notify the device-managing physician or clinic by email, SMS (text) message or fax.  Second, the Biotronik home monitoring unit has what they call an intelligent traffic light system.  I haven't any information on how the intelligent traffic light system operates.  Finally, and I think most importantly, Biotronik claims their system is capable of earlier detection of critical arrhythmic events than other systems - a "proven capability," in their words.  Since I have no information on the operational details or algorithms they use, I cannot confirm or deny their claims.  

The German Government has shown its belief in the bright future of Biotronik and its Home Monitoring technology: Nominated for the German Federal President`s "Deutscher Zukunftspreis" (German Future Award): BIOTRONIK Home Monitoring for Online Monitoring of Heart Patients.

Update: 21 October 2009.  A little more information about the research Biotronik performed with respect to the value and capabilities of their Home Monitoring system.

Biotronik Press Release Published in Reuters Regarding Home Monitoring.  This press release mentions three publications of the results of the Biotronik study; I have not yet been able to obtain a copy.  From the outside, it's hard to assess the significance of the technology or technologies that Biotronik has incorporated into their system.  However, with the possible exception of the mobile monitoring unit, it looks more like a publicity campaign than substance, because there is nothing I can see that clearly sets Biotronik's remote monitoring system apart from anyone else's with respect to data collection and/or analysis.

Monday, October 19, 2009

Update on 29 September 2009 Posting

I have an update related to my 29 September posting, Medtronic Remote Programming Patent.  I stated the following in that posting ...

I believe that Medtronic's patent ... reveals not only the extent of Medtronic's work on remote programming and their level of development of this technology, it reveals a product development path. ... The strategy that I believe Medtronic has taken is in keeping with long-standing trends in technology development.

Over the last several decades, the trend has been to move away from specialized processors to more powerful, general-purpose processors. This enables products to be defined more by software than by hardware. Processing power has become smaller, less power hungry and cheaper, thus allowing software to become the means for defining the system's capability. Furthermore, this enables multiple products to be defined by a single hardware platform. ...

The Medtronic patent suggests a similar product strategy ... that different products will use fundamentally the same hardware architecture, but will be defined by the software that they run. So, a pacemaker, a neurostimulator and a drug pump will share the same processor hardware platform, but their operation will be defined primarily by the software that they run. For example, take some time and examine pacemakers, ICDs, CRTs/CRT-Ds, neuro-stimulators, drug pumps, etc.  Although they have different purposes, they have enough in common to consider the possibility that all of them could share a common processor platform.

The implications are significant for all functional areas within Medtronic, from research and development, product development, and software development and management, to product support. Medtronic can leverage its enormous scale and make it a major asset: it can substantially reduce the number of hardware platforms it supports, it can have its software development groups produce software for multiple product lines, and it can create more products without a substantial requirement for additional support each time a product is produced. ...

I unearthed an article published in the August 2008 issue of the Journal of Computers, titled "Design Overview Of Processor Based Implantable Pacemaker," authored by Santosh Chede and Kishore Kulat, both of the Department of Electronics and Computer Science Engineering at the Visvesvaraya National Institute of Technology. (I do not have an address for you to access this article; however, if you search on the journal, the title and the authors, you will find it.)

Their article describes how they built a pacemaker using a Texas Instruments (TI) MSP430F1611 processor.  The TI MSP430 (TI MSP430 Microcontroller Website) is a general-purpose RISC processor similar in architecture to the DEC PDP-11.  It is designed for ultra-low power consumption and targeted at battery-powered, embedded applications.  In other words, this would be the kind of processor on which to base a line of implantable medical devices.  Having looked around the website, I noted that the applications of the processor include medical devices, but not implants.  However, based on the Journal of Computers article, I can see a clear route to creating implants using this processor. (I haven't yet found a comparable processor; however, I suspect one or more exist.  As I find additional processors in this class, I shall make them known in this blog.)

Finally, I think the important message of the Journal of Computers article is that it is possible to use a general purpose processor and software to create a pacemaker or any other implantable medical device such as a neuro-stimulator, CRT-D, or drug-pump. As I discussed earlier, using a general purpose processor and software to create the product, can be an effective business and technical strategy.  

Sunday, October 18, 2009

New Communication Model for Medical Devices, Continued

I came across an IEEE journal article published in 2008 by two Welch Allyn research engineers.  The article has enough relevance to the topical area of my blog that I plan to devote at least one article to discussing its contents.  The article addresses wireless communications - specifically, medical-grade wireless networks and issues surrounding their deployment.

I mention the IEEE article because having a reliable connection is a major concern for anyone implementing a medical application that uses remote programming.  Remote programming would involve downloading new instructions, new software or software patches to a device.  That download must be performed safely, securely and without error.  Furthermore, the entire connection system would have to be extremely tolerant of errors and connection breaks.  In that vein, I discuss the Rosetta-Wireless connection model in more detail, with special emphasis on how the model provides a reliable, logically stable and secure connection with significant throughput.  This is a continuation of the article titled "New Communications Model for Medical Devices" that I published 11 October 2009, and of the article I published on 14 October 2009, "Medtronic Patent Application: Communication system for medical devices."

I provide a slightly revised drawing of the connection model below.  (You should see a larger drawing in another window/tab if you click on the drawing). 


The area of interest in this discussion is defined by the curly brace on the right: the communications path between the Mobile Server and the Central Server. As I mentioned in my previous post, the Central and Mobile Servers are logically identical; i.e., whatever data is on the Central Server will migrate to the Mobile Server and vice versa. So, if a file appears on the Central Server, it will be mirrored to the Mobile Server automatically. Furthermore, since the two servers are logical twins, they continue to maintain a logical connection with each other even when physically disconnected.

From a security standpoint, all transmissions are encrypted, all data on the Mobile Server is encrypted, and the two systems authenticate each other using a shared secret. The data on a Mobile Server is managed by a secure, centralized authority; thus, if a Mobile Server is ever stolen, once that stolen Mobile Server contacts a Central Server, the Central Server will send a signal to erase its data and terminate its operation. This is important because should anyone consider such a model for the transfer of medical data, the data and any device that manages that data in the field will have to be secured.
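Shared-secret authentication of the kind described is commonly implemented as an HMAC challenge-response exchange. The sketch below is a generic illustration of that pattern, not Rosetta-Wireless's actual protocol; the secret value is a placeholder:

```python
import hashlib
import hmac
import os

SHARED_SECRET = b"provisioned-at-manufacture"   # placeholder value for illustration

def challenge():
    """The verifying server issues a random challenge (nonce)."""
    return os.urandom(16)

def respond(secret, nonce):
    """The other server proves knowledge of the secret without sending it."""
    return hmac.new(secret, nonce, hashlib.sha256).digest()

def verify(secret, nonce, response):
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(respond(secret, nonce), response)

nonce = challenge()
assert verify(SHARED_SECRET, nonce, respond(SHARED_SECRET, nonce))
assert not verify(SHARED_SECRET, nonce, respond(b"wrong-secret", nonce))
print("mutual authentication round-trip OK")
```

Because only the HMAC of the nonce crosses the wire, the secret itself is never transmitted, and a fresh nonce per session prevents replaying an old response.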

Wireless networks are inherently unreliable: they rely on radio technology, with all its physical instabilities, to provide a connection.  Motion adds further instability, as a moving device continually passes in and out of coverage. The TCP/IP protocol was not designed to handle communications with frequent breaks and re-connections; it was designed for an “always connected” state.  Furthermore, the endpoints of the communications link – the service provider's equipment (server) and the user's equipment (client) – have been designed to expect an “always connected” state as well.  Neither the network, the appliances, the user devices, nor the users are designed to handle frequent communication breaks.

The crux of this model, and the point of this article, is a means for transforming an intermittent and unreliable wireless connection environment into a reliable one. It does this in two ways.

First, the Mobile Server can utilize more than one connection simultaneously (opportunistic routing).  If two or three connections are available, data can be sent over all of them at once.  Should a connection suddenly drop, the system reconfigures itself and moves the data over the remaining channels.  For example, the Mobile Server could have both WiFi and 3G connections; if the WiFi connection suddenly dropped, the data would be routed over the 3G connection, seamlessly and without interruption.  The system thus exploits connection redundancy effectively, rather than scrambling to switch to another wireless connection only after the current one drops.  Since the software places no limit on the number of simultaneous connections, it simply moves the traffic to whichever connection or connections remain active.
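The failover behavior can be sketched as a small simulation. This is hypothetical, not the actual software: the `Channel` class and its `fail_after` knob exist only to model a link (such as WiFi) that drops mid-transfer while another (such as 3G) stays up:

```python
# Hypothetical simulation of opportunistic routing with failover:
# chunks are spread across live channels; a dropped channel is
# detected and the remaining channels carry the rest of the data.

class Channel:
    def __init__(self, name, fail_after=None):
        self.name, self.delivered = name, []
        self.up, self.fail_after = True, fail_after

    def send(self, chunk):
        # Simulate a link that dies after delivering `fail_after` chunks.
        if self.fail_after is not None and len(self.delivered) >= self.fail_after:
            self.up = False
        if not self.up:
            raise ConnectionError(f"{self.name} dropped")
        self.delivered.append(chunk)

def send_all(chunks, channels):
    for i, chunk in enumerate(chunks):
        while True:
            live = [c for c in channels if c.up]
            if not live:
                raise ConnectionError("all channels down")
            try:
                live[i % len(live)].send(chunk)   # spread load over live links
                break
            except ConnectionError:
                pass  # that channel just dropped; retry on the others

wifi = Channel("WiFi", fail_after=2)   # WiFi dies partway through
cell = Channel("3G")
chunks = [f"chunk{i}" for i in range(6)]
send_all(chunks, [wifi, cell])

# Every chunk arrives exactly once despite the WiFi drop.
assert set(wifi.delivered + cell.delivered) == set(chunks)
```

The design choice worth noting is that recovery is just a retry against the recomputed set of live channels; the sender never blocks waiting for a dead link to come back.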

Second, the two servers, Mobile and Central, continually maintain a stable connection with each other.  Should there be a connection break, each server preserves the state of the data transfer (in the case of a file – a data or software file with a known end point) or the state of the session (should the transfer be a streaming connection, such as video or voice).  Should all connections drop, the Mobile and Central Servers authenticate each other when they reconnect and resume the transfer of data.
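The state-preservation idea can be illustrated with a simple checkpointed transfer. This is again my own sketch; the offset-based resumption scheme is an assumption about how such a system could preserve transfer state, not a description of the actual implementation:

```python
# Hypothetical sketch of a resumable file transfer: the sender keeps a
# byte-offset checkpoint, so after a connection break the transfer
# resumes from the last confirmed position instead of restarting.
import io

class ResumableTransfer:
    def __init__(self, payload):
        self.payload = payload
        self.offset = 0                      # state preserved across breaks

    def send_some(self, n):
        """Return the next n bytes and advance the checkpoint."""
        chunk = self.payload[self.offset:self.offset + n]
        self.offset += len(chunk)
        return chunk

    def done(self):
        return self.offset >= len(self.payload)

payload = b"firmware-image-" * 10            # placeholder software download
xfer = ResumableTransfer(payload)
received = io.BytesIO()

received.write(xfer.send_some(40))           # partial transfer...
# -- all connections drop here; both sides keep xfer.offset --
# -- on reconnect the servers re-authenticate and pick up mid-file --
while not xfer.done():
    received.write(xfer.send_some(40))

assert received.getvalue() == payload        # nothing lost, nothing resent
```

For a streaming session the checkpoint would be session state rather than a byte offset, but the principle is the same: a break costs only a pause, never the work already done.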

(One aspect of the system related to its reliability is the Mobile Server's ability to connect to a variety of Central Servers, which increases system reliability by providing multiple connection paths to the System Service Provider Servers.)

Finally, consider the value of this model to the endpoints.  The System Service Provider Servers (or Enterprise Servers) are provided a stable, hardwired connection, so the applications running on them are not required to handle any connection problems; adding code to handle intermittent connections only adds to their complexity.  Engineers developing services, particularly those based on remote programming, can be assured that the transfer of data between the System Service Provider Servers and the Mobile Server is reliable.

Once the necessary data or software reaches the Mobile Server, the Mobile Server connects in the usual manner to manage uploads, downloads and messages with a patient's device or devices.  (A discussion for a later article.)

Biotronik has just introduced a new Home Monitoring system.  It uses a wearable monitor/wireless communications device similar to the Mobile Server I have described, although their Home Monitoring device appears to be tasked only with remote monitoring, not remote programming.  Biotronik bears watching.