
Kent State University Health Informatics Blog

Here, we explore the benefits and challenges of medical technology, and what it will take to build a highly skilled workforce and robust medical device security.
The term “telemedicine” describes the provision of remote medical assistance via telephone or video when a doctor cannot meet with a patient in person. Historically, because medical care has commonly been regarded as most effective when delivered in person, telemedicine was generally considered a last resort.
April 12, 2022
Innovative digital technologies in the healthcare field are enhancing every aspect of healthcare delivery, providing secure and seamless flows of information between patients and healthcare providers. When a provider collects clinical data on a patient, it is stored in electronic health records (EHRs). EHRs are created, managed and reviewed on an EHR platform designed to share information across a full spectrum of healthcare organizations.
EHR and HIPAA are related concepts, but there are important distinctions to understand. This article explores those distinctions along with the evolution of the regulatory environment.
January 03, 2022
In recent years, medical and technological advancements and the digitization of health data have transformed healthcare. Until recently, however, medical professionals around the world relied on outdated and “siloed” methods to track diseases.
Kent State’s online Master of Science in Health Informatics program is a candidate for CAHIIM accreditation. Learn more about CAHIIM and what this accreditation candidacy status means for our program and for you and your career development.
The scope and importance of patient advocacy have expanded in recent years to become an integral component in many healthcare settings. Instead of being something that doctors or nurses would do for their patients when they could set aside a few minutes, patient advocacy has evolved into a career that many pursue today.
ROI, or return on investment, is a basic calculation used by organizations or individuals to determine the value gained from a given effort. You may be familiar with the term as something relating to the world of business more than healthcare.
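The basic ROI calculation mentioned above can be sketched in a few lines of Python. The dollar figures below are purely hypothetical, for illustration only:

```python
def roi(gain: float, cost: float) -> float:
    """Return on investment as a percentage: (gain - cost) / cost * 100."""
    return (gain - cost) / cost * 100

# Hypothetical example: a $50,000 technology upgrade
# that yields $65,000 in savings over its lifetime.
print(roi(65_000, 50_000))  # 30.0 (a 30% return)
```

The same formula applies whether the "gain" is revenue, cost savings, or an estimated dollar value of improved outcomes, which is why ROI translates readily from business to healthcare settings.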
The healthcare industry is evolving quickly in response to heightened expectations for improved accessibility and greater security. At the same time, the demand for health informatics specialists has grown, too, as meticulous minds are needed to gather, analyze, catalogue and qualify patient and provider data. For graduates of our online health informatics programs, the field of opportunity is vast, and considerable job growth is projected across the next decade.
With new technologies and improved business models, healthcare organizations can focus on a patient-centered approach to care. However, each improvement requires data to ensure consistent experiences and calculated actions.
The paradigm of patient care is changing. A wealth of technologies such as robotics, artificial intelligence (AI), three-dimensional printing, precision medicine, augmented and virtual reality, telemedicine and genomics can now be integrated into care delivery, leading to reduced costs and increased precision and efficiency.
Population Health is a relatively new term in the healthcare industry, and depending on whom you speak to, it has different definitions. Dr. Larry Mullins, President and CEO of Samaritan Health Services, sums it up nicely in this quote: "Some would define it as determining the health of a defined group or population using health care modifiers to help make that determination. An easier answer might be just taking care of our family, friends and neighbors on a larger scale.”1
As the field of technology and our ability to gather information have grown, healthcare experts have come to depend more and more on data and technology as they treat patients. The quantity of information and the means of collecting it continue to expand, and the crucial tasks of interpreting, analyzing and implementing information in treatment plans fall to clinical informaticists.
Healthcare informatics is a burgeoning industry in the U.S. and around the world, with a projected compound annual growth rate (CAGR) of 13.8% over the next seven years. This will bring the overall industry valuation to $511.06 billion by 2027, driven by strong competition and innovation among health informatics companies.1
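As a quick illustration of how a compound annual growth rate projects a value forward, here is a minimal Python sketch using the figures cited above (the helper name `future_value` is our own, not from any source):

```python
def future_value(present: float, cagr: float, years: int) -> float:
    """Project a value forward at a compound annual growth rate:
    FV = PV * (1 + r)^n."""
    return present * (1 + cagr) ** years

# Working backward from the cited figures ($511.06B by 2027 at a 13.8% CAGR),
# the implied starting valuation is roughly $207 billion.
implied_start = 511.06 / (1 + 0.138) ** 7
print(round(implied_start, 1))
```

Compounding is the key point: at 13.8% per year, the industry valuation more than doubles over the seven-year window rather than growing by 7 × 13.8%.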
Catch up with Assistant Professor Rebecca Meehan. Her diverse interests—in gerontology and technology, among other fields—have combined to make her an expert in health information technology and a rich asset to the Kent State faculty. Here, she speaks about her background and current research, her favorite parts of teaching and life away from classes, and her encouragement for students to go beyond classes in the iSchool.
As the worldwide COVID-19 pandemic continues to affect our daily lives, the importance of protecting our families and communities through public health has likely never been greater. It’s becoming increasingly clear that the future of public health will be driven by technology, which highlights the extraordinary importance of public health informatics and its role in keeping society safe, healthy and informed.
Technology has changed almost every aspect of the way we live and work, including our approach to healthcare. Medical providers, clinical facilities and payers are increasingly adopting technology that can help them achieve the goals of higher-quality care at lower cost. Health informatics is a rapidly growing field, fusing the knowledge of technology with the desire to improve patient care. As the field grows, a competitive health informatics salary can be part of a fulfilling career.
The history of health informatics begins almost 70 years ago. In the wake of World War II, several doctors and researchers were examining the role that computers could play in helping to diagnose disease. They used logic and probabilistic reasoning to tackle specific healthcare problems in biology and medicine.1
Are you looking to advance in a career that puts you at the forefront of patient care, yet doesn't involve hands-on treatment? A health informatics (HI) master’s degree or postbaccalaureate certificate program may be your next step.
If you're wondering if a health informatics degree is worth it, you're probably considering a career pivot into the HI field. Or perhaps you're hoping to earn a leadership role. With the prevalence of electronic health care records, patient-generated data from wearable health devices and the rise of healthcare analytics, the need for trained HI professionals is growing. We'll explore the opportunities that this field has to offer, as well as the education and experience you'll need to rise through the ranks successfully.
Health informatics is the intersection between technology and healthcare, making health informatics an excellent career choice for anyone who enjoys analyzing systems, organizing information, and working with technology in a context that changes lives. Even though you wouldn't work directly with patients, as a health informatics professional you could have a significant impact on the healthcare industry and on people's lives.
What Is Healthcare Analytics and Why Does It Matter? As the average human lifespan increases along with the global population, data analytics in healthcare is poised to make a large difference in modern treatment. The use of healthcare analytics can potentially reduce the cost of treatment, predict disease outbreaks, circumvent preventable illnesses and generally improve the quality of care and life of patients.
Maybe you have spent your career working in healthcare and are interested in shifting directions to put your analytical skills to the test. Or perhaps you have a solid background in analytics and data analysis and you're looking to find the right field where you can apply your skills in a meaningful way. In either case, you should consider whether becoming a healthcare data scientist would be right for you.
Health informatics and nursing informatics utilize technology and data to create change that may lead to more efficient delivery of care, better patient care, improved health outcomes and lower costs. Master of Science in Health Informatics (MSHI) and Master of Science in Nursing with an Informatics Specialization (MSN Informatics) each teach students to gather, maintain, save, translate and distribute patient information and other healthcare data. With these similar aims, it can be easy to confuse the two practices. While there is opportunity for overlap in subject matter and career outcomes for MSHI and MSN Informatics graduates, there are many differences between these two degrees. We’ll explore each career and graduate program, but keep in mind, the main difference between health informatics and nursing informatics is how the data is used in healthcare.
Healthcare technology is evolving at a rapid pace, so we sat down with Kent State University professor Rebecca Meehan to discuss emerging trends in health informatics. She gave us her thoughts on everything from AI in healthcare and patient-generated health data to wearable EKGs and virtual visits.
Kent State University prepares graduates for health informatics jobs in clinical and industry settings, both independently and collaboratively on a team. Because our program prepares students with the foundational elements of informatics in healthcare, they are able to adapt quickly and apply their skills.
Big data has revolutionized industry after industry, and the demand for professionals specializing in big data has increased accordingly, with the number of data scientist positions in the U.S. growing by 650 percent between 2012 and 2018.1 Many of these positions are in fields like technology, energy or telecommunications, and those fields tend to be what people imagine when they think of big data. But big data has also been responsible for helping directly change—and even save—lives.
August 05, 2019
It is possible to generate a succinct answer to the question “What is health informatics?”; in fact, the U.S. National Library of Medicine defines the term as “the interdisciplinary study of the design, development, adoption and application of IT-based innovations in healthcare services delivery, management and planning.”1 Health informatics defined in this way, however, is almost unworkably broad: Each of these three areas of the healthcare spectrum is influenced and improved by informatics practices in its own unique way.
In the fall of 2018, Apple released its Apple Watch Series 4 models, the first consumer smartwatches that double as wearable medical devices thanks to their ability to generate an electrocardiogram test (ECG or EKG), which can inform you about your heart health. This release pushed the boundaries of wearable technology in healthcare and earned Apple a De Novo classification from the U.S. Food and Drug Administration (FDA).
Has the Internet Made Learning More Efficient? In the fall of 2016, almost 3 million people in the U.S. chose to enroll in postbaccalaureate studies, with 27.5% of them choosing to pursue their continued education through exclusively online programs.1 For those in the health informatics (HI) field, earning a degree in a digital format can be especially advantageous, particularly when the field itself is focused on utilizing the advances of modern technology to find new solutions for existing public health problems.
What Is Health Informatics? Simply put, health informatics (HI) refers to the collection, classification, storage and retrieval of information relating to healthcare delivery. The word “informatics” was coined in the 1960s by early computer programmers to differentiate between computer science concerned with information storage and retrieval and computer science primarily related to algorithms and programming. Though HI may involve a bit of both, when used professionally, it generally refers to the former: the processes and digital architecture relating to the storage and retrieval of healthcare information for the optimization of care delivery.
In December 2017, officials at the University of Virginia Health System (UVA Health) discovered an alarming problem: Their medical records had been hacked, and nearly 2,000 patients' records were exposed via physician devices that were infected with malware. The good news? They discovered the breach. The bad news? The breach had actually taken place in May 2015, meaning the hackers had access to data including patient names, addresses, diagnoses and treatments for 19 months.
Exploring Artificial Intelligence in Healthcare By now, most people are comfortable with the idea that networked devices and artificial intelligence (AI) are becoming part of our everyday lives. In the United States, the smart devices market—which includes security systems, lighting and appliances—was a $40 billion industry in 2017, and the number of smart devices in American homes is forecasted to increase 70 percent over the next two years.1
In 1996, just as the digital revolution was getting underway, the Health Insurance Portability and Accountability Act (HIPAA) was passed into law. At the time, Congress recognized that advances in communications technologies would be transforming the healthcare industry, and in particular healthcare data and patient records. HIPAA was intended to protect the privacy of patient medical records and comprehensively overhaul the way healthcare data was stored, processed and transacted on a national level.