Technologies such as smartphone apps, Wi-Fi-enabled data collection devices, and web-based data management systems offer the opportunity to deliver clinical trials remotely and assess their impact, an approach researchers are increasingly drawn to. However, there are drawbacks, and even experienced teams will encounter challenges. In this blog we share the lessons we learnt during the conduct of SALTS, a remote blood pressure lowering trial in Aotearoa, New Zealand, where a smartphone app was part of the intervention package. Our aim in sharing these lessons is to make other researchers considering the use of technology in their studies aware of some of the lesser-known challenges they may face.
High blood pressure (BP) is a leading cause of cardiovascular disease, the number one cause of death and disability in Aotearoa, New Zealand1. Research in other countries has shown that high BP and cardiovascular disease can be markedly reduced by lowering population sodium intake2. Dietary sodium comes mostly from salt, and we have a problem: adults in Aotearoa consume far too much – 40% more than the World Health Organization recommends3, 4. Further, unlike more than 75 other countries, including many members of the Organisation for Economic Co-operation and Development (OECD), we don’t have a national plan to reduce it5. If we want to see a big reduction in heart attacks and strokes in our country, we must reduce the average BP across our adult population, and dietary sodium reduction is likely to be one of the best ways to do it.
In 2018 we embarked on a randomised controlled trial called SALTS (Salt ALTernatives Study) to test two promising sodium-reduction strategies. The first was the SaltSwitch smartphone application (app), a tool to help people make lower-salt food choices when shopping for food (Figure 1). The second was a high-potassium, low-sodium salt substitute to use in place of regular table salt. The two strategies together were tested against a kind of ‘placebo’: simple ‘heart healthy eating’ information that is readily available to the public. The trial was run almost entirely remotely, using a smartphone app to deliver the SaltSwitch intervention and collect data from participants, Wi-Fi-enabled BP monitors, and a web-based study data management system. In this blog, we share some key lessons learnt, with the aim of informing similar future research using technology-based interventions and/or remote trial designs.
Figure 1: SaltSwitch Smartphone Application
THE SALTS BLOOD PRESSURE LOWERING STUDY
SALTS was a randomised controlled trial. We aimed to recruit 326 people with high BP who owned a smartphone. The trial started with a two-week baseline period during which participants answered a questionnaire, gave a urine sample, scanned their food purchases, and took BP readings. At the end of baseline, participants were randomly allocated to receive either the salt reduction programme (the SaltSwitch smartphone app and a dietary salt substitute) or generic information about heart healthy eating for a 12-week intervention period. The primary outcome of interest was urinary sodium excretion, a robust measure of dietary sodium intake. However, we were also interested in the effects of the salt reduction programme on other outcomes, including BP, urinary potassium excretion, and the sodium content of packaged food purchases.
Figure 2, the Consolidated Standards of Reporting Trials (CONSORT) flow diagram, shows the flow of participants through the study.
Figure 2: Flow of Study Participants
The remote trial design was based around a purpose-built SALTS smartphone app (Figure 3). Within the app, participants could give their consent, complete trial questionnaires, watch video tutorials on study processes, and use a barcode scanner to record the packaged foods they purchased. The app also showed a tailored participant calendar of upcoming study tasks and sent automated reminders directly to participants’ smartphones.
Figure 3: SALTS Study Smartphone Application
LESSONS LEARNT
We learnt many lessons from our trial, recording them during research team meetings and debriefing sessions held throughout the study and after its completion. We have organised these lessons into the four stages of a clinical trial as outlined in the CONSORT flow diagram, an evidence-based, minimum set of recommendations for reporting randomised trials: (1) Enrolment, (2) Allocation, (3) Follow Up, and (4) Analysis.
Enrolment and allocation of participants
1. For older people, owning a smartphone does not necessarily equate to using apps
More than eight in 10 people aged 35 to 54 years, and approximately half of those aged 55 years and over, own a smartphone. Given that high BP is common in people aged 40 years and over, we anticipated a large pool of eligible people would be available to take part. However, we found that many adults aged 55 and over who were otherwise eligible were not familiar with apps. Participant familiarity with smartphone apps, not just smartphone ownership, should therefore be carefully considered in studies or programmes requiring this technology. This leads us to the next lesson:
2. Face-to-face support may be critical to success
When talking with participants during the baseline period, we realised that they needed more than written support to connect and use the required technologies. We also discovered that people want to see researchers and develop a relationship with them – for many, this human connection is a key reason they volunteered in the first place. We also know how important face-to-face communication (kanohi ki te kanohi) is for Māori; excluding this element does not support digital inclusion and wellbeing in Aotearoa. In fact, we saved time when we introduced face-to-face clinic visits: people were able to work through the technology with study staff, ask questions, and develop a relationship with the team, and the time spent trying to contact participants to complete study procedures was markedly reduced (Table 1).
Table 1: Eligibility and reachability of participants completing an initial face-to-face study visit vs. fully remote procedures
| Study on-boarding procedure | Face-to-face visit | Remote |
| --- | --- | --- |
| Number referred | 127 | 78 |
| Number eligible (%) | 59 (46%) | 20 (26%) |
| Number declined (%) | 33 (26%) | 16 (21%) |
3. Interoperability between trial technologies is important
During trial enrolment, we found a problem with the interoperability of our various pieces of technology, i.e., their ability to exchange and make best use of study information: we had a web-based referral tool that did not link to our study data management system, and a data management system that did not link to Microsoft Outlook or have an automated text message and calling log. This created a huge amount of additional work for study staff tracking potential participants. Unless a trial is large and very well funded, it is worth considering simple, existing tools such as REDCap survey software and an ad-hoc text messaging service (for example, via Amazon Web Services (AWS)). They might not be as efficient as a fully interoperable trial system, but the money saved could be used to increase the number of study staff.
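To illustrate how little code an ad-hoc text messaging service requires, here is a minimal sketch in Python using the AWS SNS client (boto3). This is a hypothetical example, not a description of how SALTS was implemented; the region, phone number, and message text are placeholders.

```python
# Minimal sketch: sending a study reminder by SMS via Amazon SNS (boto3).
# Assumes AWS credentials are already configured; number and message are placeholders.
import boto3

sns = boto3.client("sns", region_name="ap-southeast-2")  # nearest AWS region to NZ
sns.publish(
    PhoneNumber="+64210000000",  # participant's mobile number (placeholder)
    Message="Kia ora! This is a reminder that your study tasks are due today.",
)
```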
Following up participants and collecting outcome data
1. Use test plans and ‘live’ mock participants
Despite rigorous testing of technology during the development phase, once we had ‘real’ participants in the trial, unanticipated and ongoing changes to the technology were required. Reasons included the challenges of testing app features in test conditions, the large variety of smartphone models and operating systems on the market, and the impact that fixing one issue in an app can have on the functionality of other aspects. We recommend the following strategies to mitigate these issues: (1) ensure consensus in the research team regarding the best tool to achieve the study outcomes; (2) create technology test plans alongside development plans and continue to retest after bug fixes throughout the live trial; (3) have a ‘soft’ launch to enable early rigorous testing with real participants; (4) build software in separate blocks of code that only connect where necessary; (5) include live ‘mock’ participants (monitored by study staff and using different smartphone models) throughout the entire trial period; and (6) ensure IT support is available throughout the trial.
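As one concrete illustration of strategy (2), below is a minimal sketch of an automated regression test in Python with pytest that would be re-run after every bug fix. The function and the device/OS matrix are hypothetical, not from the SALTS codebase; in practice, the matrix would mirror the smartphone models carried by the live ‘mock’ participants.

```python
# Hypothetical sketch: a parametrised regression test re-run after every bug fix.
# The device/OS pairs stand in for the models used by live 'mock' participants.
import pytest

SUPPORTED = {("Android", "10"), ("Android", "12"), ("iOS", "14"), ("iOS", "15")}

def can_schedule_reminder(platform: str, os_version: str) -> bool:
    """Stand-in for the app's reminder-scheduling check (illustrative only)."""
    return (platform, os_version) in SUPPORTED

@pytest.mark.parametrize("platform,os_version", sorted(SUPPORTED))
def test_reminder_schedules_on_all_supported_devices(platform, os_version):
    assert can_schedule_reminder(platform, os_version)
```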
2. Use text, not notifications
We asked participants to turn on notifications for the study app when they downloaded it, but some people had reasons not to do so, and others did not view the notifications. Simple text messages worked much better than in-app notifications, especially for people with limited Wi-Fi or data on their devices.
3. Consider rigorous standardisation of home-based blood pressure measures
Blood pressure typically varies considerably over the course of a day, so we expected to find a large standard deviation (SD) around the mean (average) of each individual’s BP measures. We allowed participants to choose their preferred times of day, one in the morning and one in the evening, to record their BP. However, the variability in BP measures was even larger than we expected. In future, we would further standardise the times at which home-based BP measures are collected, especially if BP were the primary trial outcome.
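For readers unfamiliar with the statistic, here is a small sketch of how within-person variability in home BP readings might be summarised; the readings below are invented for illustration only.

```python
# Illustrative only: mean and standard deviation of one participant's
# home systolic BP readings (mmHg); the numbers are made up.
from statistics import mean, stdev

systolic = [142, 151, 138, 160, 147, 135, 155]
print(f"mean {mean(systolic):.1f} mmHg, SD {stdev(systolic):.1f} mmHg")
```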
Data analysis
1. Remote trials require realistic windows for the return of data
We collected the following data remotely: spot urine samples were couriered to researchers with chilled freezer packs; BP measures were collected via a Wi-Fi-enabled BP monitor and sent automatically via a VPN to our networks; barcodes of packaged foods were scanned using the barcode scanner within the study app, which was linked with our data management system; and surveys were completed in the app. To ensure validity, our statistical analysis plan specified time frames for the return of these data, e.g., at baseline, spot urine samples and BP measures had to be returned within the two weeks prior to randomisation. However, some participants took much longer than we anticipated and required several follow-up calls before they returned their data. While technology issues (and the COVID-19 pandemic and associated lockdowns) played a part in pushing out these time frames, we learnt that it is important to set realistic time frames and specify them prior to starting a trial.
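A check like the following could flag baseline data returned outside a pre-specified window. This is a hypothetical Python sketch: the two-week window matches the example above, but the function name and dates are illustrative.

```python
# Hypothetical sketch: flag baseline data returned outside the pre-specified
# window (here, within the two weeks prior to randomisation).
from datetime import date, timedelta

def within_window(randomised: date, returned: date, window_days: int = 14) -> bool:
    """True if data were returned within `window_days` before randomisation."""
    delta = randomised - returned
    return timedelta(0) <= delta <= timedelta(days=window_days)

print(within_window(date(2021, 6, 14), date(2021, 6, 3)))   # True: 11 days before
print(within_window(date(2021, 6, 14), date(2021, 5, 10)))  # False: too early
```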
SUMMARY
In summary, technologies such as smartphone apps, Wi-Fi-enabled data collection devices, and web-based data management systems offer the opportunity to deliver and assess the impact of clinical trial interventions remotely. While researchers are increasingly interested in conducting online trials and drawn to digital interventions, there are potential drawbacks, and even experienced teams will encounter unanticipated challenges. During the SALTS trial we learnt lessons across all phases of the Consolidated Standards of Reporting Trials (CONSORT) flow diagram. While we had incorporated contingencies in our funding and timeframes to allow for unforeseen circumstances, we could not anticipate all the challenges that emerged – including a global pandemic. Researchers should try to ‘keep it simple’ by limiting the complexity of technology-based interventions, even if they anticipate their target audience has adequate digital literacy; that is, do not assume everyone who owns a particular device knows how to use all of its features to the level required for the intervention, or indeed has the motivation to do so. Technology can create efficiencies and reduce participant and researcher burden, but there are trade-offs in the time and cost of troubleshooting technology issues, particularly when those technologies are critical to implementing the trial protocol.
Acknowledgements and Funding
SALTS was funded by a Health Research Council of New Zealand programme grant (18/672). Helen Eyles is a Senior Fellow of the Heart Foundation of New Zealand (#1843). We would also like to acknowledge members of the SALTS trial research team: Neela Bhana, Prof Rob Doughty, John Faatui, Michelle Jenkins, Dr Yannan Jiang, Bruce Kidd, Assoc Prof Rachael McLean, Prof Bruce Neal, Dr Nhung Nghiem, Derek Park, Prof Anthony Rodgers, Shistata Shrestha, Assoc Prof Lisa Te Morenga, and Prof Nick Wilson. Finally, thank you to Prof Chris Bullen and Karen Carter for contributing to the editing of this blog.