machine learning

See the following:

The Pistoia Alliance Calls on the Life Sciences to Support Greater Collaboration to Overcome Technology Challenges

Press Release | Pistoia Alliance | March 29, 2017

The Pistoia Alliance, a global, not-for-profit alliance that works to lower barriers to innovation in life sciences R&D, is calling on the industry to improve collaborative efforts to use patient data to its full effect. In a series of keynote speeches delivered at The Pistoia Alliance's annual member conference in London, speakers from Amgen, Accenture, and AstraZeneca discussed the need to more closely connect outcomes data with the R&D process – to help pharmaceutical companies focus their research efforts and deliver real benefits to patients. Building machine learning and deep learning systems, and incorporating data from therapeutic interventions or diagnostics into R&D, is technologically challenging and would benefit significantly from industry-wide pre-competitive collaboration...

The Rise of 'Technology-Enabled' Clinical Research Companies

Melissa Fassbender | Outsourcing-Pharma.com | January 17, 2017

Eric Hodgins, senior vice president, research and development technology solutions at QuintilesIMS, told us there are a number of dynamics “significantly transforming the industry and driving an increase in technology-enabled clinical research.” Notably, there are two macro trends: the pace of innovation in scientific research and the explosion of technological advancements...

Read More »

Tidelift and NumFOCUS partner to support essential community-led open source data science and scientific computing projects

Press Release | NumFOCUS, Tidelift | October 22, 2019

NumFOCUS, a nonprofit supporting better science through open code, and Tidelift today announced a partnership to support open source libraries critical to the Python data science and scientific computing ecosystem. NumPy, SciPy, and pandas (sponsored projects within NumFOCUS) are now part of the Tidelift Subscription. Working in collaboration with NumFOCUS, Tidelift financially supports the work of project maintainers to provide ongoing security updates, maintenance and code improvements, licensing verification and indemnification, and more to enterprise engineering and data science teams via a managed open source subscription from Tidelift.

Read More »

TIM Review’s Evolution from Ottawa Journal to International Resource

Craig Lord | Ottawa Business Journal | September 21, 2017

From its humble beginnings as the Open Source Business Resource to its status today as an internationally acclaimed journal for academics and businesspeople alike, the Technology Innovation Management Review has made its name on staying ahead of the curve. Tony Bailetti, director of Carleton University’s TIM program, launched the journal back in 2007. At the time, it was an experiment to uncover how business owners might make use of open-source applications...

Read More »

To Trust Artificial Intelligence, It Must Be Open And Transparent. Period.

Machine learning has been around for a long time. But in late 2022, advancements in deep learning and large language models started to change the game and come into the public eye. And people started thinking, “We love Open Source software, so, let’s have Open Source AI, too.” But what is Open Source AI? And the answer is: we don’t know yet. Machine learning models are not software. Software is written by humans, like me. Machine learning models are trained; they learn on their own automatically, based on the input data provided by humans. When programmers want to fix a computer program, they know what they need: the source code. But if you want to fix a model, you need a lot more: the software to train it, the data to train it on, a plan for training it, and so forth. It is much more complex. And reproducing it exactly ranges from difficult to nearly impossible.

Read More »

Top 7 Open Source Business Intelligence and Reporting Tools

In this article, I review some of the top open source business intelligence (BI) and reporting tools. In economies where the roles of big data and open data are ever-increasing, where do we turn to have our data analysed and presented in a precise and readable format? This list covers tools that help solve this problem. Two years ago I wrote about the top three. In this article, I expand that list with a few more tools that were suggested by our readers. Note that this list is not exhaustive, and it is a mix of both business intelligence and reporting tools...

Top 8 Open Source Artificial Intelligence (AI) Technologies in Machine Learning

Artificial intelligence (AI) technologies are quickly transforming almost every sphere of our lives. From how we communicate to the means we use for transportation, we seem to be getting increasingly addicted to them. Because of these rapid advancements, massive amounts of talent and resources are being dedicated to accelerating the growth of these technologies. Here is a list of the 8 best open source AI technologies you can use to take your machine learning projects to the next level.

Trends in Corporate Open Source Engagement

In 1998, I was part of SGI when we started moving to open source and open standards, after having long been a proprietary company. Since then, other companies have also moved rapidly to working with open source, and the use and adoption of open source technologies has skyrocketed over the past few years. Today, company involvement in open source technologies is fairly mature and can be seen in the following trends...

Using It or Losing It? The Case for Data Scientists Inside Health Care

Marco D. Huesch, MBBS, PhD & Timothy J. Mosher, MD | NEJM Catalyst | May 4, 2017

As much as 30% of the entire world’s stored data is generated in the health care industry. A single patient typically generates close to 80 megabytes each year in imaging and electronic medical record (EMR) data. This trove of data has obvious clinical, financial, and operational value for the health care industry, and the new value pathways that such data could enable have been estimated by McKinsey to be worth more than $300 billion annually in reduced costs alone. If appropriate investments in data science are not made in-house, then hospitals and health systems will run the risk of becoming reliant on outsiders to analyze the data that ultimately will be used to inform decisions and drive innovation...

Read More »

Using the Latest Advances in Data Science to Fight Infectious Diseases

One of the most dramatic shifts in recent years that is empowering epidemiologists to be more effective at their jobs is occurring due to improvements in data technologies. In the past, the old "relational" data model dictated that data had to be highly structured and, as a result, treated in distinct silos. This made it difficult, if not impossible, to analyze data from multiple sources to find correlations. Epidemiologists would spend many minutes or even hours on each query they ran to get results back, which is unacceptable when you need to test dozens of hypotheses to try to understand and contain a fast-moving outbreak. (Imagine how you would feel if each one of your Google searches took 45 minutes to return!) By contrast, using newer technologies, the same queries on the same hardware can run in seconds.

Read More »

What Is Deep Learning, and Why Should You Care About It?

Whether it's Google's headline-grabbing DeepMind AlphaGo victory, or Apple's weaving of "using deep neural network technology" into iOS 10, deep learning and artificial intelligence are all the rage these days, promising to take applications to new heights in how they interact with us mere mortals. To go deeper (yes, I went there) on the subject, I reached out to the team at the deep learning-focused company Skymind, creators of Deep Learning For Java (DL4J), and authors of the recently released O'Reilly book Deep Learning: A Practitioner's Approach, Josh Patterson and Adam Gibson...

White House Call to Action to the Tech Community on New Open Access Machine Readable COVID-19 Dataset

Press Release | White House | March 16, 2020

Today, researchers and leaders from the Allen Institute for AI, Chan Zuckerberg Initiative (CZI), Georgetown University's Center for Security and Emerging Technology (CSET), Microsoft, and the National Library of Medicine (NLM) at the National Institutes of Health released the COVID-19 Open Research Dataset (CORD-19) of scholarly literature about COVID-19, SARS-CoV-2, and the Coronavirus group. Requested by The White House Office of Science and Technology Policy, the dataset represents the most extensive machine-readable Coronavirus literature collection available for data and text mining to date, with over 29,000 articles, more than 13,000 of which have full text.

Read More »

WHO Releases Report on Emerging Technologies and Scientific Innovations

In early July 2023, the World Health Organization (WHO) issued its 2023 report on Emerging Technologies and Scientific Innovations: A Global Public Health Perspective. This insightful and detailed report is the result of strategic engagement with a panel of global health experts through the use of an online Delphi method, roundtable discussions, and key informant interviews. The purpose of this report is to identify innovations in research and emerging technologies that have the potential to impact global health in the next five to ten years.

Why Google Is Suddenly Obsessed with Your Photos

Victor Luckerson | The Ringer | May 25, 2017

Google tends to throw lots of ideas at the wall, and then harvest the data from what sticks. Right now the company is feasting on photos and videos being uploaded through its surprisingly popular app Google Photos. The cloud-storage service, salvaged from the husk of the struggling social network Google+ in 2015, now has 500 million monthly active users adding 1.2 billion photos per day. It’s on a growth trajectory to ascend to the vaunted billion-user club with essential products such as YouTube, Gmail, and Chrome. No one is quite sure what Google plans to do with all of these pictures in the long run, and it’s possible the company hasn’t even figured that out...

Read More »

Why openly available abstracts are important - overview of the current state of affairs

The value of open and interoperable metadata of scientific articles is increasingly being recognized, as demonstrated by the work of organizations such as Crossref, DataCite, and OpenCitations and by initiatives such as Metadata 2020 and the Initiative for Open Citations. At the same time, scientific articles are increasingly being made openly accessible, stimulated for instance by Plan S, AmeliCA, and recent developments in the US, and also by the need for open access to coronavirus literature. In this post, we focus on a key issue at the interface of these two developments: The open availability of abstracts of scientific articles. Abstracts provide a summary of an article and are part of an article's metadata. We first discuss the many ways in which abstracts can be used and we then explore the availability of abstracts. The open availability of abstracts is surprisingly limited. This creates important obstacles to scientific literature search, bibliometric analysis, and automatic knowledge extraction.

Read More »