Posit AI Blog


Introducing mall for R...and Python

Python
LLM
Polars
Natural Language Processing
Tabular Data

We are proud to introduce the {mall} package. With {mall}, you can use a local LLM to run NLP operations across a data frame (sentiment analysis, summarization, translation, etc.). {mall} has been released simultaneously to CRAN and PyPI (as an extension to Polars).

Introducing Keras 3 for R

TensorFlow/Keras

We are thrilled to introduce {keras3}, the next version of the Keras R package. {keras3} is a ground-up rebuild of {keras}, maintaining the beloved features of the original while refining and simplifying the API based on valuable insights gathered over the past few years.

News from the sparkly-verse

Packages/Releases
Spark

Highlights of the most recent updates to `sparklyr` and friends.

Chat with AI in RStudio

Generative Models
Packages/Releases

Interact with GitHub Copilot and OpenAI's GPT (ChatGPT) models directly in RStudio. The `chattr` Shiny add-in makes it easy to interact with these and other Large Language Models (LLMs).

Hugging Face Integrations

Torch
Releases

Hugging Face has rapidly become a very popular platform for building, sharing, and collaborating on deep learning applications. We have worked on integrating the torch for R ecosystem with Hugging Face tools, allowing users to load and execute language models from its platform.

Understanding LoRA with a minimal example

Torch
Concepts

LoRA (Low-Rank Adaptation) is a technique for fine-tuning deep learning models that works by reducing the number of trainable parameters, and enables efficient task switching. In this blog post we illustrate the key ideas behind LoRA with a minimal torch example.

GPT-2 from scratch with torch

Torch
Natural Language Processing

Implementing a language model from scratch is, arguably, the best way to develop an accurate idea of how its engine works. Here, we use torch to code GPT-2, the immediate successor to the original GPT. In the end, you'll have an R-native model that can make direct use of Hugging Face's pre-trained GPT-2 model weights.

What are Large Language Models? What are they not?

Meta
Concepts
Natural Language Processing

This is a high-level, introductory article about Large Language Models (LLMs), the core technology that enables the much-en-vogue chatbots as well as other Natural Language Processing (NLP) applications. It is directed at a general audience, possibly with some technical and/or scientific background, but no knowledge is assumed of either deep learning or NLP. Having looked at major model ingredients, training workflow, and mechanics of output generation, we also talk about what these models are not.

safetensors 0.1.0

Packages/Releases

Announcing safetensors, a new R package for reading and writing files in the safetensors format.

torch 0.11.0

Torch
Packages/Releases

torch v0.11.0 is now on CRAN. This release features much-enhanced support for executing JIT operations. We also improved the loading of model parameters, and added a few quality-of-life improvements, such as support for temporarily modifying the default torch device, support for specifying data types as strings, and more.

LLaMA in R with Keras and TensorFlow

TensorFlow/Keras
Generative Models
Natural Language Processing

Implementation and walk-through of LLaMA, a Large Language Model, in R, with TensorFlow and Keras.

Group-equivariant neural networks with escnn

Torch
Concepts
Image Recognition & Image Processing

escnn, built on PyTorch, is a library that, in the spirit of Geometric Deep Learning, provides a high-level interface for designing and training group-equivariant neural networks. This post introduces important mathematical concepts, the library's key actors, and essential library use.

luz 0.4.0

Torch
Packages/Releases

luz v0.4.0 is now on CRAN. This release adds support for training models on ARM Mac GPUs, reduces the overhead of using luz, and makes it easier to checkpoint and resume failed runs.

torch 0.10.0

Torch
Packages/Releases

torch v0.10.0 is now on CRAN. This version upgraded the underlying LibTorch to 1.13.1, and added support for Automatic Mixed Precision. As an experimental feature, we now also support pre-built binaries, so you can install torch without having to deal with the CUDA installation.

De-noising Diffusion with torch

Torch
Image Recognition & Image Processing
Generative Models

Currently, in generative deep learning, no other approach seems to outperform the family of diffusion models. Would you like to try for yourself? If so, our torch implementation of de-noising diffusion provides an easy-to-use, easy-to-configure interface.

Deep Learning and Scientific Computing with R torch: the book

Torch
Meta
Concepts
Packages/Releases

Please allow us to introduce Deep Learning and Scientific Computing with R torch. Released in e-book format today, and available freely online, this book starts out by introducing torch basics. From there, it moves on to various deep-learning use cases. Finally, it shows how to use torch for more general topics, such as matrix computations and the Fourier Transform.

Implementing rotation equivariance: Group-equivariant CNN from scratch

Torch
Spatial Data
Image Recognition & Image Processing

We code up a simple group-equivariant convolutional neural network (GCNN) that is equivariant to rotation. The world may be upside down, but the network will know.

Upside down, a cat's still a cat: Evolving image recognition with Geometric Deep Learning

Torch
Spatial Data
Concepts

In this first in a series of posts on group-equivariant convolutional neural networks (GCNNs), meet the main actors — groups — and concepts (equivariance). With GCNNs, we finally revisit the topic of Geometric Deep Learning, a principled, math-driven approach to neural networks that has consistently been rising in scope and impact.

AO, NAO, ENSO: A wavelet analysis example

Torch
Packages/Releases
Time Series

El Niño-Southern Oscillation (ENSO), North Atlantic Oscillation (NAO), and Arctic Oscillation (AO) are atmospheric phenomena of global impact that strongly affect people's lives. ENSO, first and foremost, brings with it floods, droughts, and ensuing poverty, in developing countries in the Southern Hemisphere. Here, we use the new torchwavelets package to comparatively inspect patterns in the three series.

Wavelet Transform - with torch

Torch
Concepts

torch does not have built-in functionality to do wavelet analysis. But we can efficiently implement what we need, making use of the Fast Fourier Transform (FFT). This post is a very first introduction to wavelets, suitable for readers who have not encountered them before. At the same time, it provides useful starter code, showing an (extensible) way to perform wavelet analysis in torch. It is an excerpt from the corresponding chapter in the forthcoming book, Deep Learning and Scientific Computing with R torch, to be published by CRC Press.

torch 0.9.0

Torch
Packages/Releases

torch v0.9.0 is now on CRAN. This version adds support for ARM systems running macOS, and brings significant performance improvements.

Discrete Fourier Transform - with torch

Torch
Concepts

It has been said of the Fourier Transform that it is one of the greatest wonders of the universe. At the same time, it can be realized in a mere half-dozen lines of code. Even if, in the end, you're just going to call torch's built-in functions directly, it helps to understand, and be able to reproduce in code, the ideas that underlie the magic. This post is an excerpt from the forthcoming book, Deep Learning and Scientific Computing with R torch, to be published by CRC Press.

Five ways to do least squares (with torch)

Torch
Concepts
Tabular Data

Get to know torch's linalg module, all while learning about different ways to do least-squares regression from scratch. This post is a condensed version of the corresponding chapter in the forthcoming book, Deep Learning and Scientific Computing with R torch, to be published by CRC Press.

Audio classification with torch

Torch
Audio Processing

Learn how to classify speech utterances with torch, making use of domain knowledge and deep learning. This post is a condensed version of the corresponding chapter in the forthcoming book, Deep Learning and Scientific Computing with R torch, to be published by CRC Press.