Building a VR Community: DSI Hosts Second Annual Questioning Reality Conference

By: Cormac Rea

Photo: Justin Lenis Photography

Leading scholars, industry professionals and VR enthusiasts again convened at the second annual Questioning Reality: Explorations of Virtual Reality (VR) and our Social Future conference – a three-day event exploring the future of virtual reality and its impact on social interactions in mediated environments, encompassing VR, augmented reality (AR), mixed reality (MR), extended reality (XR) and the next generation of AI-driven immersive environments.

Hosted by the Data Sciences Institute (DSI) — the University of Toronto’s multidisciplinary hub for data science innovation and collaboration — the conference was co-led by the DSI’s Bree McEwan, a professor in the Institute of Communication, Culture, Information and Technology (ICCIT) at the University of Toronto Mississauga, and Sun Joo (Grace) Ahn, director of the Center for Advanced Computer-Human Ecosystems and professor at the University of Georgia. 

“We look forward to welcoming new ideas, new synergies and discussion at this edition of Questioning Reality. With the AI boom, things that were not possible – even months ago – are now possible. We want to lean into this space and start this discussion of how generative AI can shape communication and interactions in immersive spaces,” said Ahn. 

“The connection between VR and data science is intertwined to the extent that – when we get into investigating VR – everything is data,” noted McEwan.  

“Our mission is to connect the people doing the work behind data science – engineers, computer scientists, data scientists and others – with the people who are developing and exploring areas related to VR and its impact.”

The conference began with a series of mini-grant lightning talks, featuring research teams that had received DSI grants following the 2024 Questioning Reality conference. Teams shared insights into the effects of using VR for emotion regulation, on perceptual conflicts during social interactions, and as a teaching tool – both for VR-driven applications and as a method of educational delivery in virtual classrooms.

The panel included Josh Baldwin (University of Georgia); Eugy Han (Stanford University); Tim Huang (University of Pittsburgh) and Kristine Nowak (University of Connecticut). 

“[Our] project examines how asymmetrical access to VR affects learning, engagement and related outcomes for students,” said Huang. 

“We have already produced some initial findings at the individual level – for instance, that people have better visual learning with VR, while non-VR users have greater auditory gains and lower cognitive load – and we hope to learn more as our research progresses.” 

The conference featured a keynote presentation on immersive work and collaboration in the financial sector by Dr. Blair MacIntyre, Global Head of Immersive Technology Research, Global Technology Applied Research, JP Morgan Chase. 

In a talk entitled, Social XR and the Enterprise, MacIntyre discussed immersive presentations for financial and wealth advisors, immersive counterspaces for mentoring meetings, and support networks for use during hybrid conference experiences.

On day two of the conference, attendees were able to hear directly from panelists in government, philanthropic organizations and academia regarding their respective criteria for funding VR and immersive technology research.  

The panel comprised Joshua Greenberg (Program Director, Digital Information Technology, Sloan Foundation), Alison Krepp (Social Science Program Manager, National Oceanic and Atmospheric Administration) and Sylvie Lamoureux (Vice President, Research Programs, Social Sciences and Humanities Research Council), and was moderated by Mia Wong (University of Colorado), a Questioning Reality Fellow. 

“At Sloan, our north star is advancing scientific research,” said Greenberg. “Science is a social collaborative effort and after 2020 we began to think more intentionally in the foundation about remote social experiences. The question at Sloan becomes, how do we turn that into a program strategy, how do we understand human behaviour in immersive environments?”  

“When I go back to my board and explain why we fund the DSI’s Questioning Reality, it is to explain how we are helping facilitate scientific advancement through technology.” 

Questioning Reality is supported by the Alfred P. Sloan Foundation, a not-for-profit, mission-driven grantmaking institution dedicated to improving the welfare of all through the advancement of scientific knowledge. The grant was awarded to the DSI to delve into VR technology and its profound implications for human interaction and communication.

The second conference keynote was led by Dr. Pablo Pérez (Nokia eXtended Reality Lab, Madrid) and included a fireside chat with Grace Ahn and a reception, co-sponsored by U of T’s Schwartz Reisman Institute for Technology & Society (SRI). 

Topics of discussion included using VR and immersive technology for remote learning and work; interaction and privacy for new users of immersive tech; immersive communication; AI and telepresence; and accessibility, health and remote assistance.  

“The future that we envision is to create a reality where people that are far from each other can connect, to link local realities,” explained Pérez.  

“As we try to shape our research to that aim at Nokia, we are always attempting to create a better way to connect, to create technology that helps the world act together.”

Photo: Questioning Reality 2025 attendees (credit: Data Sciences Institute) 

A highlight of the final day of the conference was a panel discussion entitled, Building VR Labs. Panelists addressed the challenges of building VR labs and conducting research with the technology, as well as how to effectively balance research with marketing and operations needs at VR/XR labs.  

The panel included: Grace Ahn (University of Georgia), Tammy Lin (National Chengchi University), Tony Liao (University of Houston) and Kristine Nowak (University of Connecticut). 

“The keyword [to building labs and centres] is sustainability,” said Ahn. “Most start-ups fail after their initial surge because once you get big, the amount of funding that is needed is enormous.”  

“Once you go big, there is a lot of effort to sustain the organization and ensure you don’t implode,” she added.  

“You have to think about how to grow an organization and stay nimble so you can pivot in a funding situation like we experience in universities. The vision of what you want to build needs to be deliberate.” 

Discussions from the conference will be reflected in a new edition of Debates in Digital Media focused on social virtual reality. Collaborative teams were formed to work on projects to be presented at future Questioning Reality conferences. The Questioning Reality conference and Sloan Foundation grant serve as a beacon of support and recognition for the DSI’s commitment to pushing the boundaries of knowledge and innovation in the data sciences.

Leadership Spotlight: Meredith Franklin

Prof. Meredith Franklin joins Data Sciences Institute (DSI) as Associate Director, Joint Initiatives

By: Cormac Rea

Get to know Professor Meredith Franklin, who joined the Data Sciences Institute (DSI) as the Associate Director, Joint Initiatives this year. 

Franklin is an Associate Professor jointly appointed in the Department of Statistical Sciences and School of the Environment, Faculty of Arts & Science at the University of Toronto. She is also the lead for the Data Science concentration of the Master of Science in Applied Computing (MScAC) program. 

In the Associate Director, Joint Initiatives role at the DSI, Franklin will be responsible for developing joint programming opportunities with other university units. She will draw on her substantial experience developing educational data science programs that have leveraged offerings across departments, faculties and external partners, creating opportunities for students to learn skills that are applicable in the classroom and on-the-job. 

Franklin’s own interdisciplinary research centres on using data science to better understand how the physical environment affects public health. She has been a leader in developing spatiotemporal methods that leverage large ground- and space-based datasets to characterize human exposures to environmental factors including air pollution, wildfires, oil and gas flaring, noise, artificial light at night, and greenspace.   

How did you first become aware of the DSI and what led to your role as Associate Director, Joint Initiatives? 

I became familiar with the Data Sciences Institute (DSI) shortly after arriving at the University of Toronto, in part due to its strong ties with the Department of Statistical Sciences. From the beginning, I was impressed by the breadth of opportunities the DSI provides for students and postdoctoral researchers.  

When Lisa Strug asked me to join the DSI, I didn’t hesitate for a moment. I feel that my research and teaching closely align with the institute’s mission. I am deeply committed to ensuring that data science maintains a strong, visible presence at the University of Toronto. The DSI serves as a flagship institute in this space, and its reputation across campus speaks volumes. I am genuinely excited and proud to be part of it. 

Please speak to the research you do with your cross-appointment in the School of the Environment and Department of Statistical Sciences. 

My research is deeply grounded in data science, with a strong emphasis on machine learning and AI tools.  

I primarily work in environmental exposure assessment, where I integrate data from a range of sources including ground measurements, space-based satellite instruments, and climate models to estimate human exposures to environmental hazards. These exposure estimates are then used in environmental health and epidemiological studies to better understand how environmental factors affect health outcomes. 

A central focus of my work is on air quality, specifically assessing pollutants such as particulate matter, ozone, and nitrogen dioxide. I develop high-resolution spatiotemporal exposure models at regional to global scales, which are critical for supporting large-scale epidemiological investigations into the health impacts of air pollution. 

What role does AI play in your data science protocols for research? 

Data science and AI play a central role in my research. Much of my work has pioneered the use of satellite images for environmental applications, which requires processing vast amounts of data with sophisticated tools to extract meaningful insights. Several years ago we began using neural networks to generate exposure estimates from satellite images, and since then we have expanded our approaches to incorporate state-of-the-art AI techniques including transfer learning and generative models. While these methods are often associated with large language models, we have been adapting them for environmental data applications.    

Transfer learning, in particular, has been instrumental in managing the challenge of working with large volumes of satellite imagery when only limited ground-truth measurements are available. By training models on available data and applying them to broader domains, we are able to generate robust predictions beyond the original training set. Generative AI has similarly enhanced our work, enabling us to produce high-resolution exposure maps from lower-resolution satellite data. Together, these techniques allow us to generate realistic, spatially and temporally detailed environmental exposure estimates. 
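
The transfer-learning idea Franklin describes can be sketched in miniature. The example below is a hypothetical toy (synthetic data, made-up feature counts, and ridge regression standing in for her neural networks): a model is first fit on an abundant proxy task, then fine-tuned on a handful of ground-truth measurements by penalizing deviation from the pretrained weights rather than from zero.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # number of satellite-derived features (hypothetical)

# Source task: abundant proxy labels (e.g., a satellite-derived product)
w_src = rng.normal(size=d)
X_src = rng.normal(size=(500, d))
y_src = X_src @ w_src + 0.1 * rng.normal(size=500)

# Pretrain: ridge regression on the source task
lam = 1.0
w_pre = np.linalg.solve(X_src.T @ X_src + lam * np.eye(d), X_src.T @ y_src)

# Target task: only 6 ground-truth monitor readings, slightly shifted relationship
w_tgt = w_src + 0.1 * rng.normal(size=d)
X_few = rng.normal(size=(6, d))
y_few = X_few @ w_tgt + 0.1 * rng.normal(size=6)

# Transfer: argmin_w ||X w - y||^2 + lam * ||w - w_pre||^2
w_transfer = np.linalg.solve(X_few.T @ X_few + lam * np.eye(d),
                             X_few.T @ y_few + lam * w_pre)

# Baseline: same ridge, but shrinking toward zero ("from scratch")
w_scratch = np.linalg.solve(X_few.T @ X_few + lam * np.eye(d), X_few.T @ y_few)

# Evaluate both on held-out target data
X_test = rng.normal(size=(200, d))
y_test = X_test @ w_tgt
err_transfer = np.mean((X_test @ w_transfer - y_test) ** 2)
err_scratch = np.mean((X_test @ w_scratch - y_test) ** 2)
```

With fewer target samples than features, the from-scratch fit has no information in several directions and shrinks them toward zero, while the transferred fit falls back on the pretrained weights there, so its held-out error is lower.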

We are also incorporating physics into AI through physics-informed neural networks, a novel and increasingly important approach in environmental modeling. By embedding physical processes, such as advection and diffusion, as partial differential equations within the network architecture, we can ensure that the predicted evolution of air pollutant concentrations over space and time remains physically realistic and scientifically credible. 
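
A minimal sketch of that physics penalty, under assumed toy parameters (the fields, grid and constants below are synthetic, and finite differences stand in for a network's automatic derivatives): for the one-dimensional advection-diffusion equation c_t + u c_x = D c_xx, the physics-informed term is the squared PDE residual of the predicted concentration field.

```python
import numpy as np

u, D, t0 = 1.0, 0.1, 0.5          # advection speed, diffusivity (hypothetical)
x = np.linspace(-3.0, 5.0, 401)   # space grid
t = np.linspace(0.0, 1.0, 401)    # time grid
dx, dt = x[1] - x[0], t[1] - t[0]
T, X = np.meshgrid(t, x, indexing="ij")  # arrays indexed [time, space]

def pde_residual(c):
    """Finite-difference residual of c_t + u*c_x - D*c_xx on interior points."""
    c_t = (c[2:, 1:-1] - c[:-2, 1:-1]) / (2 * dt)
    c_x = (c[1:-1, 2:] - c[1:-1, :-2]) / (2 * dx)
    c_xx = (c[1:-1, 2:] - 2 * c[1:-1, 1:-1] + c[1:-1, :-2]) / dx**2
    return c_t + u * c_x - D * c_xx

# An exact advected-diffusing Gaussian plume: its residual should be ~0
tau = 4 * D * (T + t0)
c_exact = np.exp(-((X - u * T) ** 2) / tau) / np.sqrt(np.pi * tau)

# A physically inconsistent field: the same plume spreading but not advected
c_wrong = np.exp(-(X ** 2) / tau) / np.sqrt(np.pi * tau)

rms = lambda r: float(np.sqrt(np.mean(r ** 2)))
rms_exact = rms(pde_residual(c_exact))
rms_wrong = rms(pde_residual(c_wrong))

# In a PINN, this penalty is added to the data-misfit loss:
#   loss = mean((c_pred - c_obs)**2) + weight * mean(pde_residual(c_pred)**2)
```

The physically consistent field earns a near-zero penalty while the inconsistent one is heavily penalized, which is what steers the trained network toward predictions that evolve realistically in space and time.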

Ultimately, our goal is to build AI systems that do more than just fit the data. We want them to respect the underlying constraints and structure of the physical world to produce estimates that are both accurate and credible within the field of environmental exposure science. 

I believe that the responsible application of AI requires a strong foundation in data science and statistical principles. Understanding underlying data structures, model assumptions, and statistical reasoning is essential for applying advanced AI tools effectively. In my view, these foundational elements must be fully integrated into any scientific use of AI, particularly in fields like environmental modeling, where rigor, transparency, and interpretability are critical. 

Would you tell us a little about your experience building data science educational programs? 

I came to the University of Toronto just three years ago, after spending nearly 12 years at the University of Southern California (USC). At USC, I served as a faculty member in Biostatistics and, around 2018–2019, led the development of a new Public Health Data Science program. Setting up the program required extensive collaboration across departments at USC. 

Our aim was to focus on applied data science within the specific domain of public health where there was a clear and growing need. While USC already had a data science program based in the computer science department, our aim was to develop a professional master’s program targeted toward students who had quantitative backgrounds in domains outside of statistics and computer sciences.  

Bridging different disciplines and organizational structures was a key part of launching a successful program. We developed new courses, leveraged existing ones, and partnered with computer science to offer students a program that merged technical and theoretical rigor with the applied needs of data science students. 

The first cohort began in 2020—a challenging time to launch a new program in the midst of the COVID-19 pandemic, but we managed successfully. Through this experience, I gained valuable expertise in building interdisciplinary programs from the ground up. I look forward to bringing that experience to the DSI, helping to develop new data science programming and training initiatives by collaborating with multiple units to create opportunities that meet the evolving needs of students. 

Please comment on the role that data science plays in your current work with respect to the training that you develop or teach. 

Currently I teach a data science course for undergraduate students in the Joint Statistics and Computer Science Program. In developing this course, I built upon the introductory graduate-level course I offered as part of the USC data science program, adapting it to suit the needs and skill levels of undergraduates. 

In developing data science training programs, my focus is not only on theory but also on preparing students with practical skills they need to succeed in the workforce. I strive to include tools and techniques that are often not covered in traditional coursework, such as accessing and working with real-world data, querying APIs, web scraping and parallel processing tools needed for managing large and complex datasets. These are critical skills for both scientific research and industry careers. 

It’s important to me to stay closely connected to industry trends and ensure the tools and methods I teach remain current and relevant. I update my course materials every year to reflect advances in the field and to respond to the evolving needs of students aiming for careers in data science. My goal is to equip students with both strong foundational knowledge and the hands-on skills that will make them competitive in the job market.

DSI-Supported Research Team Links EV Sales to Childhood Asthma Reduction

Effect of EV sales on childhood asthma rates. Photo provided by Harshit Gujral, Meredith Franklin, Steve Easterbrook (no reuse) 

By: Cormac Rea

As Electric Vehicles (EVs) have become a more familiar sight on our streets and highways, one may wonder – has there been a corresponding effect on public health from reduced traffic pollution?

In a paper recently published in the journal Environmental Research, a Data Sciences Institute (DSI)-funded research team showed that EV sales in the US have had a positive and measurable impact on reducing childhood asthma cases.

“There are many policies in the US that specifically focus on reducing the burden of asthma, but none of these policies directly address the asthma cases stemming from traffic-related air pollution,” said DSI Doctoral Student Fellow Harshit Gujral, lead author of the paper entitled, Emerging evidence for the impact of Electric Vehicle sales on childhood asthma: Can ZEV mandates help?

“Previous research shows that around 18-42 per cent of all cases of childhood asthma – which is a huge number – are attributed to traffic-related air pollution,” he added. “So clearly there is this gap between what is covered in the major policies and actual effectiveness in reducing asthma.”

Along with co-authors and DSI proposal supervisors – Steve Easterbrook (Department of Computer Science, Faculty of Arts and Science, University of Toronto), Meredith Franklin (Department of Statistical Sciences, Faculty of Arts and Science, University of Toronto) and Paul Kushner (Department of Physics, Faculty of Arts and Science, University of Toronto) – Gujral and his team were able to employ a cross-department approach that leveraged expertise across disciplines.

“It was clear in this case how important it was to bring together different skills and work collaboratively,” said Franklin.

“A key component was wrangling multiple nationwide data sets from various sources and making them work together in one cohesive analysis.”

Gujral also highlighted the DSI funding as creating a pathway for researchers to focus on their work over a sustained period without distraction.

“The DSI provided three-year funding, which meant that I have not had to apply for the funding every year and could just focus on my research and outcomes,” he said.

“The DSI funding created the bandwidth to do exactly that.”

Using childhood asthma as a proxy due to its widespread impact on the population, the research team relied on publicly available datasets from the U.S. Centers for Disease Control and Prevention from 2013-2019, as well as independently obtained EV sales data.

“Employing linear mixed models from data science, we were able to find the associations between the sales of EVs and the cases of asthma due to traffic-related air pollution,” explained Gujral.
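
As a hedged sketch of that kind of analysis (all numbers and variable names below are synthetic, and a state fixed-effects least-squares fit stands in for the paper's linear mixed models), one can recover a per-1,000-vehicles effect from state-year panel data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_states, n_years = 10, 7  # synthetic panel, e.g. 2013-2019

# Simulate state-year data with a known effect:
# one new childhood asthma case per 1,000 new gas-powered vehicles sold
state = np.repeat(np.arange(n_states), n_years)
total_sales = rng.uniform(50_000, 500_000, size=state.size)
ev_share = rng.uniform(0.0, 0.3, size=state.size)
gas_sales = total_sales * (1 - ev_share)

state_baseline = rng.uniform(100, 1_000, size=n_states)  # state-level intercepts
cases = state_baseline[state] + 1.0 * (gas_sales / 1_000) \
        + rng.normal(0, 20, size=state.size)

# Design matrix: one dummy per state (absorbing baseline differences)
# plus gas-powered sales in thousands
dummies = (state[:, None] == np.arange(n_states)[None, :]).astype(float)
Xmat = np.hstack([dummies, (gas_sales / 1_000)[:, None]])

coef, *_ = np.linalg.lstsq(Xmat, cases, rcond=None)
effect_per_1000 = coef[-1]  # estimated cases per 1,000 gas vehicles sold
```

The fitted `effect_per_1000` lands close to the simulated value of 1.0; a true mixed model would instead treat the state intercepts (and possibly slopes) as random effects, which is one way a replacement threshold can be allowed to vary by state.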

The research team found that for every 1,000 new gas-powered vehicles sold, there was one new case of childhood asthma. The team also found that replacing approximately 21 per cent of these sales with electric vehicles appeared to be sufficient to halt rising asthma rates caused by new vehicle sales. However, this number varied depending on the state and various factors — such as population density and the number of existing gas-powered vehicles on the road.

For instance, in some states, replacing just seven per cent of gas car sales with electric vehicles might be enough to halt rising asthma rates caused by new vehicle sales. But in other states, 42 per cent of new car sales had to be electric vehicles in order to have any impact.

“A fundamental finding of this research is that the health impacts of EVs will only manifest when EVs replace existing non-EV vehicles,” said Gujral. “If one simply adds more EVs on the road, it might not result in the same health benefits.”

The research team’s findings indicate there’s already a measurable public health benefit being seen in the U.S. from the increase of electric vehicles on the road.

“A 36-77 per cent fleet share of electric vehicles should minimize the asthma burden due to reducing the amount of nitrogen dioxide emitted from gas-powered automobiles, but this doesn’t eliminate all the pollutants that are produced by EVs,” said Gujral.

“Next we want to go to the level of ZIP code to understand this problem a bit more and, at the same time, look further at the socioeconomic implications, as low-income communities are the most disproportionately impacted by traffic-related air pollution.”

Questioning Reality: Exploring the Future of Virtual Reality and Impact on Social Interactions 

Photo courtesy of Pablo Pérez, Nokia XR Labs, Madrid, Spain

By: Cormac Rea

The Data Sciences Institute (DSI) at the University of Toronto hosts the annual Questioning Reality: Explorations of Virtual Reality conference where leading scholars, industry professionals, and VR enthusiasts are invited to discuss the future of virtual reality (VR) and its impact on social interactions.

The conference is led by Bree McEwan, DSI lead for Responsible Data Science and Associate Professor in the Institute of Communication, Culture, Information and Technology at the University of Toronto Mississauga, and Sun Joo (Grace) Ahn, Director of the Center for Advanced Computer-Human Ecosystems and Professor at the University of Georgia.

The 2025 Questioning Reality conference will feature speaker Dr. Pablo Pérez, a unique researcher in the extended reality (XR) field, on April 24. Pérez has a deep understanding of both the technical challenges and the social communication processes involved in improving human interactions via immersive technologies.

Pérez is the lead researcher in Nokia’s XR labs in Madrid, Spain, drawing on his extensive experience in both academic and industry environments. His work helps us to understand the way that visual images and communication processes come together to create rich and meaningful co-presence in mediated environments.

Profs. McEwan and Ahn invited Dr. Pérez to speak on the challenges and opportunities for the VR field as artificial intelligence (AI) is integrated into VR social experiences, including generative imagery and large language models that run virtual agents.

“Developments in artificial intelligence will drive the next generation of immersive environments, whether it is making the metaverse come alive through virtual imagery generated in real-time or interacting with virtual agents who might populate these virtual scenes,” says Prof. McEwan.

“Dr. Pérez’s research stands at the bleeding edge of interdisciplinary inquiries into AI, its integration into metaverse spaces, and social interactions between humans and machines. I have been following his research with interest for quite some time now and we are delighted to have him join the Questioning Reality 25 conference,” says Prof. Ahn.

In advance of his talk, DSI spoke with Dr. Pérez about the “Realverse,” XR, the “realism” that AI can bring to social interactions and the concerns that society should have about these technologies. 

What drew you to research extended reality and the “Realverse”?  

Eight years ago, Nokia launched a new research lab in Madrid to investigate the end-to-end delivery of VR and AR. At that time, we were looking for a research direction which might have impact in the long term, in a similar way as smartphones revolutionized our lives. And then I asked myself: which reason could lead my 70-year-old mother to wear a VR headset? The only answer which came to me: to visit my brother, who lives abroad. This was the inspiration to explore the potential of XR technologies in bringing people together. 

What types of experiences are better suited to XR and immersive technologies than the physical world?   

I don’t think that any technology can be better than face-to-face communication. But what XR can do is help us break some of the barriers that we encounter when communicating. The most obvious one is distance. Telegraphy made it possible to have instantaneous news distribution around the world. Telephony extended this capability to personal communications. Video calls have made remote face-to-face conversations possible. XR can bring the next step, where I not only see your face when talking to you, but I can see what you see and share your space. This has an enormous potential to connect people, but it also has tremendous economic implications. Imagine that you could hold a remote meeting, or set up a remote workplace, with exactly the same effectiveness as in-person. This would change everything. 

How can social XR be designed to highlight the “human” side of communication, like emotions and support?   

Distance is not the only barrier to overcome; mediated communication makes it difficult to convey emotional cues such as facial expressions or body language. But it also provides an advantage: there is already a device taking part in the communication, so we can use the power of artificial intelligence to augment our emotional intelligence. The key here is how we address the problem: not using the system to gain advantage over the other, to try to detect what the other is trying to hide, but to gain agency over the emotions that we want to include in the conversation.

Let’s give a couple of examples. An XR system could be trained to detect and code my emotional cues and represent them in a different way. When I smile, it could subtly modify the environment to display a warmer color palette, for instance. This would help me express my emotions in a way that I control. A second example is personalized emotion regulation. The system could be trained to detect the moments where I am getting overly emotional in the conversation, such as when I get too angry, and alert me so that I can rethink what I am doing and let my long-term rational sense take over. This would be hacking the fast-thinking system and letting the slow-thinking system kick in when needed, in the terms of Daniel Kahneman.

Note that in both examples the user has full control over the system and its outcomes; there is no unethical intrusion into the other’s inner state. This is the key. 

How realistically can AI-based agents simulate social interactions in virtual environments?   

The explosion of large language models has shown that it is relatively easy for a virtual agent to communicate in natural language. In a way, simulating a social interaction is an almost-solved problem in a text chat. Translating this into a virtual environment requires solving two problems: the interface and the role.

Regarding the interface, LLMs currently operate mostly with discrete blocks of text or multi-modal inputs, but this is not how a conversation works. Next-generation agents should be able to continuously process a flow of information and decide when and how to take part in the conversation, including interrupting, taking turns, and deciding on what to do at any time. This is not an extremely hard problem, but it is not solved yet.

The second problem is understanding what the role of a virtual agent in the conversation should be. AI-based agents are already being incorporated as NPCs in gaming, or as support systems in customer service. But social XR could bring new use cases, for instance personalized agents that could be used for asynchronous communication. Imagine that, instead of sending you a recorded message, I send you a representation of myself which tells you the message and is also able to have a conversation about it, because it knows the context of the message itself. It won’t be equivalent to being there in person, but it could be better than not being present at all. 

But how can this technological toolbox be strategically leveraged to find the “killer app” that drives widespread adoption of XR communication?   

A big problem with XR technologies is that the “wow effect” makes people evaluate their first impressions of the technology very indulgently but, in the long run, users quickly get tired of wearing an HMD regularly. As a side effect, XR devices and applications are normally designed for “geeks”: you need a long adaptation period to be able to handle XR devices regularly. This might not be obvious if you are a frequent technology user, but it appears quickly when you try to have a non-technical person use XR. So it would probably be better to design the system together with people who are not able or willing to adapt. In our lab, we have learned a lot by using our systems with older adults and with people with intellectual disabilities. Now we think that any long-term vision must first be validated by, and when possible co-designed with, users who are going to experience difficulties with your technology. By adopting an inclusive-by-design approach, XR technology can enhance human communication by addressing individual limitations and augmenting personal capabilities, effectively providing each user with personalized “superpowers” that improve accessibility and empathy in daily interactions.  

What concerns should society have about these technologies?   

XR technologies can augment the way we communicate, which is in principle positive, but of course it is not free from risks. The good news is that those risks are basically the ones already identified in other technological flavours. All the concerns about social media problems and the overuse of screens – losing the connection with reality, privacy issues, echo chambers, loss of attention span – will still be there for social XR. It is key for the research community to address them upfront, so that we steer the development of XR in the direction of mitigating them, instead of reinforcing them. 

This talk and reception are co-sponsored by the Alfred P. Sloan Foundation and U of T’s Schwartz Reisman Institute for Technology & Society (SRI).

The Sloan Foundation is a not-for-profit, mission-driven grantmaking institution dedicated to improving the welfare of all through the advancement of scientific knowledge.

SRI’s mission is to deepen knowledge of technologies, societies, and what it means to be human by integrating research across traditional boundaries and building human-centred solutions that really make a difference.

The talk is hosted at the Schwartz Innovation Campus at the heart of Toronto’s innovation district. 

Drawing on AI and Other Data Sciences to Design Next-Gen Joint Replacements

Photo courtesy of Faculty of Applied Sciences and Engineering, University of Toronto (credit: Neil Ta)

By: Cormac Rea

A major challenge for the Canadian healthcare system involves creating biomedical implants such as knee and hip replacements that will not require extensive follow-up or revision surgery. The demand for expensive revision surgeries continues to grow as the population ages, so there is an urgent need to reduce revision rates. When University of Toronto researcher Yu Zou learned of the problem, he wanted to help.  

“I’m a material scientist and really want to make materials that are useful to people and society,” said Zou.  

Zou also needed to understand how and why implants fail. Post-surgery complications are attributed to various failure modes of implant materials and are also associated with patients’ identity factors – such as sex, age, physical disability, activity level and body mass index – as well as the regions in which patients live. An interdisciplinary team was needed to employ sound data science methods to identify these variables from national health data sets.  

“I had a chance to speak with some hospital doctors and they told me there can be problems with the materials, specifically the durability of implants. Millions of dollars from the healthcare system are spent on joint replacements, often leading to revision surgery if certain parts don’t work well,” he added.  

Supported by a Data Sciences Institute catalyst seed grant, Professors Zou (Associate Professor, Faculty of Applied Science & Engineering, University of Toronto), Qiang Sun (Associate Professor, Department of Statistical Sciences and Department of Computer Science, University of Toronto) and Adele Changoor (Staff Scientist, Orthopaedic Surgery, Lunenfeld-Tanenbaum Research Institute and Assistant Professor, Department of Laboratory Medicine & Pathology, Temerty Faculty of Medicine, University of Toronto) came together to employ data science methodologies combined with AI tools, analyzing massive datasets on joint replacement patients to help design complex microstructure materials. 

In developing new implants, Zou’s team needed to work with expensive materials more common in aerospace engineering to come up with microstructures that could provide the necessary strength and durability but also the lower elastic modulus required of a human joint.   

“In our lab we use data science tools and AI tools together to help us develop and manufacture new generation materials for extreme environments,” said Zou.  

Using data science insights from the hip and knee replacement revision surgery data registries, the researchers created algorithms to help derive insights from machine learning tools, in turn expediting the development of new implant materials. 

“It is just like ‘cooking’ meals,” said Zou. “We tried something and tested it, tried something different and tested again, and so on. So initially the efficiency was very low and there was a very high cost, both in terms of the funding required but also the time cost for those working on the project.”  

“Statistics and AI can streamline the lengthy trial-and-error process, narrowing thousands of possibilities down to a select few best options,” added Sun. 

“In this way, we only need to test about ten samples instead of thousands. This greatly shortens the research cycles and associated costs,” concluded Zou.  

The researchers continue to develop the data sets and necessary microstructures with the intent of further developing partnerships with hospitals, with a vision to develop a product that can be used by frontline hospital clinicians. Given that patient-specific biology (e.g. bone density, activity levels) contributes to implant survivability, the long-term goal is to build open-source tools for clinicians to be able to easily use at hospitals.  

“In the future, doctors could possibly visualize an accelerated simulation of the joint implant’s suitability, based on the patient, and see how the materials would change or degrade over five, ten or twenty years,” revealed Zou. 

With preliminary results of their research in place, Zou’s team was successful in applying for external funding in 2024.  

“The initial support funding from DSI was very helpful in securing external funding streams,” said Zou. “The New Frontiers in Research Fund from the federal government will support us in our work for another two years.”  

“Statistics and data sciences, including AI, have the potential to transform fields that heavily rely on trial-and-error approaches,” said Sun. “Their impact will likely be seen across many disciplines.”