Published by the Foundation for Open Access Statistics. Editors-in-Chief: Bettina Grün, Torsten Hothorn, Rebecca Killick, Edzer Pebesma, Achim Zeileis. ISSN 1548-7660; CODEN JSSOBK
Authors: Michael Kane, John W. Emerson, Stephen Weston
Title: Scalable Strategies for Computing with Massive Data
Abstract: This paper presents two complementary statistical computing frameworks that address challenges in parallel processing and the analysis of massive data. First, the foreach package allows users of the R programming environment to define parallel loops that may be run sequentially on a single machine, in parallel on a symmetric multiprocessing (SMP) machine, or in cluster environments without platform-specific code. Second, the bigmemory package implements memory- and file-mapped data structures that provide (a) access to arbitrarily large data while retaining a look and feel that is familiar to R users and (b) data structures that are shared across processor cores in order to support efficient parallel computing techniques. Although these packages may be used independently, this paper shows how they can be used in combination to address challenges that have effectively been beyond the reach of researchers who lack specialized software development skills or expensive hardware.

Page views: 6740. Submitted: 2012-09-05. Published: 2013-11-20.
Paper (PDF): Scalable Strategies for Computing with Massive Data (Downloads: 5752)
bigmemory_4.4.5-1.tar.gz: R source package (Downloads: 471; 183KB)
foreach_1.4.1-1.tar.gz: R source package (Downloads: 425; 351KB)
Replication R code for examples from the paper (Downloads: 418; 2KB)
Airline.tar.bz2: data set (Downloads: 642; 1GB)

DOI: 10.18637/jss.v055.i14

This work is licensed under the following licenses:
Paper: Creative Commons Attribution 3.0 Unported License
Code: GNU General Public License (version 2 or version 3) or a GPL-compatible license.