By: Michelle McNickle
Telehealth services offer substantial opportunities for healthcare cost savings, along with proven effectiveness in improving patient care, particularly in rural areas. To get the most bang for the buck, however, much work remains to be done.
“With the widespread adoption of EMRs, digital health records provide physicians/clinicians with the remote monitoring capabilities to communicate with their patients,” said Fred Pennic, founder of HIT Consultant and senior advisor at Aspen Advisors.
This remote access to care saves time and money by allowing physicians to work with more patients and by cutting out travel expenses for people in rural areas — many of whom find travel to be a financial and physical hardship.
Pennic believes some key endeavors need to take place for the full positive effects of telehealth to be felt. He offers this food for thought:
1. Establish an incentive-based program. According to Pennic, sustainable funding is vital to the successful, widespread adoption of telehealth. “Creating more incentive-based programs or grants will provide agencies and other organizations with the funding necessary to overcome the start-up costs associated with implementing such initiatives,” he said. Recent research has shown that the potential cost savings of such initiatives can be substantial, making the case for incentive-based programs to get telehealth initiatives up and running that much stronger. For example, after evaluating a telehealth program, researchers at Stanford University found spending reductions of approximately 7.7 percent to 13.3 percent, or $312 to $542 per person per quarter.
2. Develop the infrastructure. “Having adequate infrastructures [in place] to support these initiatives [is] imperative,” said Pennic. Infrastructure is the “heart of telehealth,” he said, and includes investing in equipment such as fiber optics, broadband/wireless coverage, video, computer, voice and imaging.
3. Improve telehealth reimbursements. As it stands legislatively, said Pennic, there’s no universal reimbursement policy among public and private sectors governing the reimbursement of telehealth services — something he believes is imperative to its widespread adoption and success. “Current payment for telemedicine services, such as offsite reading of medical images, includes Medicaid, Medicare, employers and private insurers,” he said. “However, payment is limited for interactive consultations and chronic-care patients.”
4. Foster user acceptance and confidence in telehealth. “Perhaps the greatest challenge in telehealth is increasing the user acceptance of technology, for both clinicians and patients who aren’t tech savvy,” said Pennic. Ideally, he said, successful telehealth programs must integrate the telehealth process seamlessly into healthcare and patient environments. And although the federal Medicare program for seniors and disabled Americans doesn’t currently reimburse for telehealth and home monitoring services, a recent article suggests that could change quickly, given the upswing in acceptance of telehealth programs. In fact, according to Dr. Joseph Kvedar, director of the Center for Connected Health at Partners Healthcare in Boston, the future is “quite bright” for payment and reimbursement programs. Statistics have demonstrated telehealth’s effectiveness, the article states, and confidence in its ability to reduce readmission rates is growing.
5. Allocate resources and time. In addition to meeting technology requirements, said Pennic, successful telehealth programs must be given the resources and time necessary to ensure their widespread adoption. “People and processes are the key components to effective telehealth utilization,” he said. Laurence C. Baker, PhD, a professor of health research and policy at Stanford, agrees. After studying the Health Buddy telehealth program, which was used by Medicare patients in the Northwest, he found that two main aspects played most into its success: the “tight” integration of information and care management, and the device itself, which was patient-friendly and easy to use.
By: Brian T. Horowitz
In his CES keynote, Qualcomm CEO Paul Jacobs announced a $10 million X Prize for a team that can design a health device modeled after the "Tricorder" gadget from "Star Trek."
At the 2012 International Consumer Electronics Show in Las Vegas, Qualcomm and the nonprofit X Prize Foundation announced the Tricorder X Prize contest to award $10 million to a team that can develop a handheld device to diagnose a patient's health.
The Qualcomm Foundation, the chip maker's philanthropic arm established in 2010, is working with X Prize on the competition. The X Prize Foundation is a nonprofit organization that runs competitions to stimulate research and development.
Qualcomm and X Prize have modeled the contest after the "Tricorder" scanning device, familiar to fans of the "Star Trek" television series and movies. Dr. Leonard "Bones" McCoy and Spock, the Vulcan science officer, frequently used the Tricorder on the show, Qualcomm CEO Dr. Paul Jacobs noted in his keynote.
The "Star Trek" stories introduced different types of Tricorders, including models for medical scanning, and others for scanning alien environments for life forms or a variety of geological or atmospheric data.
"Health care today certainly falls far short of the vision portrayed in 'Star Trek,'" said Jacobs. "This competition will accelerate the development of tools that can empower consumers to take charge of their own bodies and manage their own care."
X Prize has conducted similar contests in education, global development, aerospace, energy and environment. Its competitions include the $30 million Google Lunar X Prize in aerospace and the $10 million Archon Genomics X Prize for genomic sequencing, sponsored by drug benefit manager Medco Health Solutions.
Dr. Peter H. Diamandis, chairman and CEO of the X Prize Foundation, announced the Tricorder competition with Jacobs during the Qualcomm leader's CES keynote Jan. 10.
In his keynote, Jacobs also mentioned Qualcomm's new 2net cloud platform, which will deliver medical data from patients to caregivers. He also highlighted the company's Snapdragon chips for Android and Windows 8.
Teams developing devices in the competition will incorporate data from wireless sensors, imaging technologies and artificial intelligence into an "easy-to-use" handheld device, said Diamandis.
The Tricorder can be brought to life if all of these technologies are "seamlessly" integrated into a device that's easy for consumers to use, he said.
"We are looking to drive an extraordinary set of breakthroughs in health care," Diamandis said during Jacobs' keynote.
The winning team will need to develop a mobile platform that accurately diagnoses 15 diseases across 30 consumers in three days without a physician. The platform must also be able to capture vital data, such as blood pressure, respiratory rate and temperature.
By sponsoring the competition, Qualcomm aims to motivate entrepreneurs, engineers, scientists and doctors to create wireless health services and technologies that increase access to health care and make the health care system more efficient, said Jacobs.
"We're really working hard to develop new wireless tools, devices, sensors and services that are helping people interact with their health care providers and manage their own wellness," the Qualcomm CEO said. "This is making health care more accessible and more affordable."
In his keynote, Jacobs also introduced Dr. Eric Topol, chief academic officer for Scripps Health, who demonstrated medical monitoring technologies for smartphones. Scripps Health is a nonprofit health system in San Diego.
Topol showed apps that displayed cardiogram waves and blood glucose readings on his Sony Ericsson Xperia smartphone. With personalized medicine a trend to watch in 2012, Topol also demonstrated a sensor that takes a saliva sample and uses DNA sequencing to tell whether a medication might work for an individual patient or whether side effects might occur.
X Prize's Diamandis took inspiration from Topol's demonstration.
"Our goal is to take the technology you saw Dr. Eric Topol demonstrate here light-years forward and really to bring the Tricorder technology of 'Star Trek' to life," said Diamandis.
By: Scott Nishimura
Mary Mentesana sees the stats: Texas continues to generate jobs, far more than any other state.
But she hasn't gotten an offer.
She figures she has applied for 75 jobs since losing her post as an office administrator for a major investment firm in March. She's focusing now on jobs for assistants that pay half the $50,000 she was making. Her unemployment benefits run out in 13 weeks, and she says she has no savings after caring for her parents for 10 years.
"I get up, I look on the Internet, I go to indeed.com, I network through my friends, I volunteer at Bass Hall, I volunteer at church," said Mentesana, 49, who lives in an apartment in Keller. "I'm just keeping busy doing the things I know to do."
Mentesana, who spent two days before Christmas trolling for leads at Workforce Solutions for Tarrant County, knows she isn't alone.
For her and other job-seekers, 2012 may be just as tough. Texas added 226,000 jobs in 2011 through November, up 2.2 percent. But it has also added residents, increasing competition in the job market and leaving the state's unemployment rate at a relatively high 8.1 percent in November.
The Federal Reserve Bank of Dallas is forecasting slower job growth next year, of 1.5 to 2 percent, citing reduced government hiring and a projected slowdown in exports because of financial problems in Europe.
"If we grow at this year's pace, that would be a good outcome," said Pia Orrenius, a Dallas Fed senior economist.
Where the jobs are
But even in this uncertain economy, pockets of the job market remain vibrant. Economists and other experts point to several sectors that are projected to generate jobs over the next several years.
Financial services, healthcare, information technology, management, teaching and energy are high on the Texas Workforce Commission's list of growth sectors for Tarrant County through 2018.
"There's a lot of entry-level jobs, as well as higher-paying, more professional and technical jobs," said Jann Miles, strategic planning unit director at Workforce Solutions for Tarrant County. "We have a pretty balanced economy."
Engineers -- aerospace, petroleum and software among them -- continue to be in demand.
Information technology is "strong again," Miles said. "People who can develop apps are being hired straight out of college."
Healthcare, driven by an aging populace, will continue to spin off jobs ranging from medical assistants to doctors, Miles said.
Maturing Barnett Shale production has meant more administrative and managerial jobs related to the oil and gas industry, Miles said.
Teachers will be in demand, given Texas' population growth, she said.
"It just makes sense when you consider how many people are moving here," Miles said. "The problem is the economics haven't been worked out."
Financial advisers, the fastest-growing U.S. job category, will continue to be in high demand, the commission projects.
"The financial sector has a certain amount of volatility to it, but everybody is trying to figure out what to do with their money," Miles said.
With the region a major hub, logistics spawns warehouse, truck-driving and other jobs. The retail and food service sectors are generating entry-level jobs, Miles said.
Manufacturing's long-term outlook is boosted by foreign trade, the auto sector, expansion at the Port of Houston, the energy sector and highway construction, said Nathaniel Karp, chief economist for BBVA Compass, which has branches in Texas and elsewhere in the South.
Karp expects Texas to continue generating 15,000 to 20,000 jobs per month, a little less than in 2011 and well below pre-recession peaks of 27,000 in 2006 and 2007.
"If we keep that pace, that's pretty solid," he said.
He cites Texas' numerous built-in strengths to keep it ahead of the U.S. overall.
"We have a very solid base on the export side; our major trading partners are doing relatively well," he said. "It's one of the few states in the nation where employment in the manufacturing sector is growing. Texas continues to attract people from other states."
One example: General Electric will add more than 600 jobs in far north Fort Worth next year when it opens factories to build locomotives and mining equipment.
The state's hospitality industry continues to grow, he said, and professional and business services are doing well. Construction "is obviously suffering, but I think the worst is over," he said.
And the outlook has brightened for new college grads.
Employers continue to increase their hiring projections, the National Association of Colleges and Employers found in an annual fall survey.
Responding employers said they expect to hire 9.5 percent more new grads in 2011-12, and more than half plan to raise their number of hires.
Among employers that plan to boost hiring, more than half indicated that their companies have more business or are growing.
Oil and gas extraction firms topped the survey, with employers expecting 19.4 percent more hiring.
Utilities, construction, chemical manufacturing, and computer and electronics manufacturing rounded out the top five.
NACE follows up with a spring survey that generally reflects actual hiring.
"Nine percent is OK right now, but a more robust economy would push it up to 12 to 15 percent," said Ed Koc, the association's research director.
Area job-networking groups continue to encourage members -- especially older ones -- to broaden their skills.
A big bright spot at the Southlake Focus Group, Tarrant County's most prominent group, came this fall when a recruiter for Carlisle & Gallagher, a firm doing loan paperwork reviews for banks, announced at a meeting that it was hiring 250 contract analysts in Dallas.
The company hired at least 20 Southlake members.
About a quarter of the Southlake group's landings this year were for contract or temporary work.
"As a futurist, I believe in the prediction that in 10 years, half of all American workers will be independent," said Doug Anderson, a member of the Southlake group's leadership team and a consultant who recently launched a firm called the Solopreneur Center.
Scott Nishimura, 817-390-7808
By: Shahid Shah
As most of my regular readers know, I work as a technology strategy advisor for several different government agencies; in that role I get to spend quality time with folks from NIST (the National Institute of Standards and Technology), what I consider one of the government’s most prominent think tanks. They’re doing yeoman’s work trying to get the massive federal government’s different agencies working in common directions, and the technology folks I’ve met seem cognizant of the influence (good and bad) they have; they seem to try to wield that power as carefully as they know how. Since most of you are in the technology industry, albeit specific to healthcare, I recommend that you learn more about NIST and the role it plays; they can make your life easier because of the coordination and consensus-building work they do for us all. I, for one, was thrilled when NIST was picked as the governing body for the MU certification criteria. These guys know what they’re doing, and I wish they got more involved in driving healthcare standards.
A few years ago NIST came up with the first drafts of the seminal definitions of Cloud Computing; they ended up setting the stage for communicating complex technical concepts and helped make “Cloud” a household name. After 15 drafts, the 16th and final definition was published as The NIST Definition of Cloud Computing (NIST Special Publication 800-145) in September. It’s worth reading because it’s only a few pages and is understandable by the layperson. No computer science degree is required.
Yesterday I was speaking to a senior executive in the EHR space and we had a great discussion on what healthcare providers are doing in terms of cloud computing and how to communicate these ideas to small practices as well as hospitals. It reminded me of the numerous similar conversations I’ve had with other senior executives we serve in the medical devices and other regulated IT sectors. In almost every conversation I can remember about this topic over the past couple of years, I had to remind people that NIST has already done the hard work and that we can, indeed, rely on them. Most of the time the senior executive was unaware of where the definitions came from so I figured I’d put together this quick advisory.
My strong recommendation to all senior healthcare executives is that we not come up with our own definitions for cloud components; instead, when communicating anything about the cloud, we should instruct our customers about NIST’s definitions and then tie our product offerings to those definitions. The essential characteristics, deployment models, and service models have already been established, and we should use them. When we do that, customers know that we’re not trying to confuse them and that they have an independent way of verifying our cloud offerings as real or vapor.
Below I have copied and pasted the key definitions from NIST 800-145. Imagine how many debates with technicians at client sites you would avert if, during conversations, you communicated some of the following information first, showed them that it is a “standard definition” and handed them a copy of the publication, and then mapped your offerings and discussions to the different areas. Your sales and marketing teams would appreciate the clarity, too.
Note that you do not need to map every offering you have to every definition; just start mapping the obvious ones, and then figure out how to communicate the “gaps”, either as not applicable to your products and services or as items that will be filled in the future as part of your roadmap. Treat these definitions as canonical but not all-inclusive, meaning that just because your SaaS offering doesn’t fit every essential characteristic doesn’t mean that you’re not “cloud”; it just means partially cloud. A rough sketch of one way to record such a mapping appears at the end of this post.
If you’ve got questions about how to map your product offerings, drop me some comments and I’ll assist as best as I can.
Here are the key definitions from NIST 800-145, copied directly from the original source:
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models.
On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.
Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).
Resource pooling. The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter). Examples of resources include storage, processing, memory, and network bandwidth.
Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be appropriated in any quantity at any time.
Measured service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.
Software as a Service (SaaS). The capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure. The applications are accessible from various client devices through either a thin client interface, such as a web browser (e.g., web-based email), or a program interface. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
Platform as a Service (PaaS). The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages, libraries, services, and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly configuration settings for the application-hosting environment.
Infrastructure as a Service (IaaS). The capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, and deployed applications; and possibly limited control of select networking components (e.g., host firewalls).
Private cloud. The cloud infrastructure is provisioned for exclusive use by a single organization comprising multiple consumers (e.g., business units). It may be owned, managed, and operated by the organization, a third party, or some combination of them, and it may exist on or off premises.
Community cloud. The cloud infrastructure is provisioned for exclusive use by a specific community of consumers from organizations that have shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be owned, managed, and operated by one or more of the organizations in the community, a third party, or some combination of them, and it may exist on or off premises.
Public cloud. The cloud infrastructure is provisioned for open use by the general public. It may be owned, managed, and operated by a business, academic, or government organization, or some combination of them. It exists on the premises of the cloud provider.
Hybrid cloud. The cloud infrastructure is a composition of two or more distinct cloud infrastructures (private, community, or public) that remain unique entities, but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).
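As one illustration (this is not part of the NIST publication, and the product name, feature assessments and helper function below are invented for the example), here is a minimal Python sketch of the mapping exercise described earlier: a hypothetical EHR offering is scored against the five essential characteristics, and any gaps are reported as roadmap items rather than disqualifiers.

```python
# A minimal, hypothetical sketch of the mapping exercise described above.
# The product name and the True/False assessments are invented for
# illustration; replace them with your own offerings and an honest
# evaluation of each one.

NIST_ESSENTIAL_CHARACTERISTICS = [
    "on-demand self-service",
    "broad network access",
    "resource pooling",
    "rapid elasticity",
    "measured service",
]

# Hypothetical EHR offering: True where it meets the characteristic
# today, False where it does not (a roadmap "gap").
acme_ehr_saas = {
    "on-demand self-service": True,   # practices sign up and provision online
    "broad network access":   True,   # browser and tablet clients
    "resource pooling":       True,   # multi-tenant database
    "rapid elasticity":       False,  # capacity is added manually today
    "measured service":       True,   # per-user, per-month metering
}

def cloud_coverage(offering: dict) -> str:
    """Report which NIST essential characteristics an offering meets,
    and flag the gaps to communicate (or put on the roadmap)."""
    met = [c for c in NIST_ESSENTIAL_CHARACTERISTICS if offering.get(c)]
    gaps = [c for c in NIST_ESSENTIAL_CHARACTERISTICS if not offering.get(c)]
    label = "cloud" if not gaps else "partially cloud"
    return (f"{label}: meets {len(met)}/5 essential characteristics; "
            f"gaps: {', '.join(gaps) or 'none'}")

print(cloud_coverage(acme_ehr_saas))
# -> partially cloud: meets 4/5 essential characteristics; gaps: rapid elasticity
```

The same table-driven pattern extends naturally to the three service models and four deployment models; the point is simply that the vocabulary comes from NIST 800-145 rather than from your own marketing materials.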