Joshua Garland
Santa Fe Institute Omidyar Fellow


1399 Hyde Park Rd, Santa Fe
Hello. I am a mathematician, researcher, time-series analyst, teacher, and information theorist.
I am passionate about complexity science.
Welcome to my academic profile.
Available for consulting and collaboration.



In the study of complex adaptive systems, the available data often falls far short of the demands of the theory. In mathematics, for instance, we often make statements like “assume we have an infinite noise-free time series, then…” But in studying complex systems, we often ask questions like “I have 347 noisy observations that I collected over three years in a jungle. What can we learn from this information?” My research aims to develop rigorous models that bridge the gap between theory and observation---data that may be wildly lacking in the eyes of mathematics, but that may still contain valuable information about the system. Said differently: when perfect isn’t possible, how can we adapt mathematics to describe the world around us? In studying complicated, ill-sampled, noisy systems, my work focuses on understanding how much information is present in the data, and how to extract, understand, and use it---but not overuse it.

Specifically, I am working toward a parsimonious reconstruction theory for nonlinear dynamical systems. In addition, I aim to leverage information mechanics (e.g., production, storage, and transmission) to gain insight into important yet imperfectly measured systems, like the climate, traded financial markets, and the human heart. My hope is that this combination of new mathematical theory, analysis, and application can eventually shed a little more light on universalities like emergence, regime shifts, and phase transitions.

I received my Ph.D. from the University of Colorado under the supervision of Elizabeth Bradley; my dissertation introduced a new paradigm in delay-coordinate reconstruction theory. Before that, I earned an M.S. in Applied Mathematics, also from the University of Colorado, where I constructed dynamical models of computer performance, and a dual B.S. in Mathematics and Computer Science from Colorado Mesa University.









(collaboration network)

OCT 2016


International Symposium on Intelligent Data Analysis XV

Stockholm, Sweden

Paleoclimate records are extremely rich sources of information about the past history of the Earth system. We take an information-theoretic approach to analyzing data from the WAIS Divide ice core, the longest continuous and highest-resolution water isotope record yet recovered from Antarctica. We use weighted permutation entropy to calculate the Shannon entropy rate from these isotope measurements, which are proxies for a number of different climate variables, including the temperature at the time of deposition of the corresponding layer of the core. We find that the rate of information production in these measurements reveals issues with analysis instruments, even when those issues leave no visible traces in the raw data. These entropy calculations also allow us to identify a number of intervals in the data that may be of direct relevance to paleoclimate interpretation, and to form new conjectures about what is happening in those intervals—including periods of abrupt climate change.

Conference Proceedings J. Garland, T.R. Jones, E. Bradley, R. James, J.W.C. White
OCT 2013


International Symposium on Intelligent Data Analysis XII

London, England

Computers are nonlinear dynamical systems that exhibit complex and sometimes even chaotic behavior. The low-level performance models used in the computer systems community, however, are linear. This paper is an exploration of that disconnect: when linear models are adequate for predicting computer performance and when they are not. Specifically, we build linear and nonlinear models of the processor load of an Intel i7-based computer as it executes a range of different programs. We then use those models to predict the processor loads forward in time and compare those forecasts to the true continuations of the time series.

IDA-13 Frontier Prize Recipient

Conference Proceedings J. Garland and E. Bradley
OCT 2011


International Symposium on Intelligent Data Analysis X

Porto, Portugal

Traditional approaches to the design and analysis of computer systems employ linear, stochastic mathematics—techniques that are becoming increasingly inadequate as computer architects push the design envelope. To work effectively with these complex engineered systems, one needs models that correctly capture their dynamics, which are deterministic and highly nonlinear. This is important not only for analysis, but also for design. Even an approximate forecast of the state variables of a running computer could be very useful in tailoring system resources on the fly to the dynamics of a computing application—powering down unused cores, for instance, or adapting cache configuration to memory usage patterns. This paper proposes a novel prediction strategy that uses nonlinear time-series methods to forecast processor load and cache performance, and evaluates its performance on a set of simple C programs running on an Intel Core® Duo.

Conference Proceedings J. Garland and E. Bradley
NOV 2016


Physica D: Nonlinear Phenomena

Computing the state-space topology of a dynamical system from scalar data requires accurate reconstruction of those dynamics and construction of an appropriate simplicial complex from the results. The reconstruction process involves a number of free parameters and the computation of homology for a large number of simplices can be expensive. This paper is a study of how to compute the homology efficiently and effectively without a full (diffeomorphic) reconstruction. Using trajectories from the classic Lorenz system, we reconstruct the dynamics using the method of delays, then build a simplicial complex whose vertices are a small subset of the data: the “witness complex”. Surprisingly, we find that the witness complex correctly resolves the homology of the underlying invariant set from noisy samples of that set even if the reconstruction dimension is well below the thresholds for assuring topological conjugacy between the true and reconstructed dynamics that are specified in the embedding theorems. We conjecture that this is because the requirements for reconstructing homology are less stringent: a homeomorphism is sufficient—as opposed to a diffeomorphism, as is necessary for the full dynamics. We provide preliminary evidence that a homeomorphism, in the form of a delay-coordinate reconstruction map, may exist at a lower dimension than that required to achieve an embedding.

Journal Paper J. Garland, E. Bradley, J.D. Meiss
JUNE 2018


Chaos: An Interdisciplinary Journal of Nonlinear Science

Understanding the mechanics behind the coordinated movement of mobile animal groups (collective motion) provides key insights into their biology and ecology, while also yielding algorithms for bio-inspired technologies and autonomous systems. It is becoming increasingly clear that many mobile animal groups are composed of heterogeneous individuals with differential levels and types of influence over group behaviors. The ability to infer this differential influence, or leadership, is critical to understanding group functioning in these collective animal systems. Due to the broad interpretation of leadership, many different measures and mathematical tools are used to describe and infer "leadership", e.g., position, causality, influence, information flow. But a key question remains: which, if any, of these concepts actually describes leadership? We argue that instead of asserting a single definition or notion of leadership, the complex interaction rules and dynamics typical of a group imply that leadership itself is not merely a binary classification (leader or follower), but rather a complex combination of many different components. In this paper we develop an anatomy of leadership, identify several principal components, and provide a general mathematical framework for discussing leadership. With the intricacies of this taxonomy in mind, we present a set of leadership-oriented toy models that should be used as a proving ground for leadership inference methods going forward. We believe this multifaceted approach to leadership will enable a broader understanding of leadership and its inference from data in mobile animal groups and beyond.

Journal Paper J. Garland, A.M Berdahl, J. Sun and E. Bollt
JULY 2017


PLoS One

Persistent atrial fibrillation (AF) can be viewed as disintegrated patterns of information transmission by action potential across the communication network consisting of nodes linked by functional connectivity. To test the hypothesis that ablation of persistent AF is associated with improvement in both local and global connectivity within the communication networks, we analyzed multi-electrode basket catheter electrograms of 22 consecutive patients (63.5 ± 9.7 years, 78% male) during persistent AF before and after the focal impulse and rotor modulation-guided ablation. Eight patients (36%) developed recurrence within 6 months after ablation. We defined communication networks of AF by nodes (cardiac tissue adjacent to each electrode) and edges (mutual information between pairs of nodes). To evaluate patient-specific parameters of communication, thresholds of mutual information were applied to preserve 10% to 30% of the strongest edges. There was no significant difference in network parameters between both atria at baseline. Ablation effectively rewired the communication network of persistent AF to improve the overall connectivity. In addition, successful ablation improved local connectivity by increasing the average clustering coefficient, and also improved global connectivity by decreasing the characteristic path length. As a result, successful ablation improved the efficiency and robustness of the communication network by increasing the small-world index. These changes were not observed in patients with AF recurrence. Furthermore, a significant increase in the small-world index after ablation was associated with synchronization of the rhythm by acute AF termination. In conclusion, successful ablation rewires communication networks during persistent AF, making it more robust, efficient, and easier to synchronize. 
Quantitative analysis of communication networks provides not only a mechanistic insight that AF may be sustained by spatially localized sources and global connectivity, but also patient-specific metrics that could serve as a valid endpoint for therapeutic interventions.

Journal Paper S. Tao, S. F. Way, J. Garland, J. Chrispin, L. A. Ciuffo, M. A. Balouch, S. Nazarian, D. D. Spragg, J. E. Marine, R. D. Berger, H. Calkins and H. Ashikaga
FEB 2016


Physical Review E

Delay-coordinate reconstruction is a proven modeling strategy for building effective forecasts of nonlinear time series. The first step in this process is the estimation of good values for two parameters, the time delay and the embedding dimension. Many heuristics and strategies have been proposed in the literature for estimating these values. Few, if any, of these methods were developed with forecasting in mind, however, and their results are not optimal for that purpose. Even so, these heuristics—intended for other applications—are routinely used when building delay coordinate reconstruction-based forecast models. In this paper, we propose an alternate strategy for choosing optimal parameter values for forecast methods that are based on delay-coordinate reconstructions. The basic calculation involves maximizing the shared information between each delay vector and the future state of the system. We illustrate the effectiveness of this method on several synthetic and experimental systems, showing that this metric can be calculated quickly and reliably from a relatively short time series, and that it provides a direct indication of how well a near-neighbor based forecasting method will work on a given delay reconstruction of that time series. This allows a practitioner to choose reconstruction parameters that avoid any pathologies, regardless of the underlying mechanism, and maximize the predictive information contained in the reconstruction.

Journal Paper J. Garland, R.G. James, E. Bradley
NOV 2015


Chaos: An Interdisciplinary Journal of Nonlinear Science

Prediction models that capture and use the structure of state-space dynamics can be very effective. In practice, however, one rarely has access to full information about that structure, and accurate reconstruction of the dynamics from scalar time-series data—e.g., via delay-coordinate embedding—can be a real challenge. In this paper, we show that forecast models that employ incomplete reconstructions of the dynamics—i.e., models that are not necessarily true embeddings—can produce surprisingly accurate predictions of the state of a dynamical system. In particular, we demonstrate the effectiveness of a simple near-neighbor forecast technique that works with a two-dimensional time-delay reconstruction of both low- and high-dimensional dynamical systems. Even though correctness of the topology may not be guaranteed for incomplete reconstructions like this, the dynamical structure that they do capture allows for accurate predictions—in many cases, even more accurate than predictions generated using a traditional embedding. This could be very useful in the context of real-time forecasting, where the human effort required to produce a correct delay-coordinate embedding is prohibitive.

Journal Paper J. Garland and E. Bradley
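The near-neighbor forecast strategy in the paper above can be sketched in a few lines via the classic "method of analogues": build delay vectors, find the past vector closest to the present one, and predict its successor. This is a minimal illustration of the idea (the function names and the sinusoidal test signal are mine, not the paper's code):

```python
import math

def delay_reconstruct(x, dim=2, tau=1):
    """Build delay vectors (x[t], x[t - tau], ..., x[t - (dim - 1) * tau])."""
    start = (dim - 1) * tau
    return [tuple(x[t - j * tau] for j in range(dim)) for t in range(start, len(x))]

def nn_forecast(x, dim=2, tau=1):
    """Predict the next value of the series: find the past delay vector
    closest to the current one and return the value that followed it."""
    vecs = delay_reconstruct(x, dim, tau)
    current = vecs[-1]
    best_i = min(range(len(vecs) - 1), key=lambda i: math.dist(vecs[i], current))
    # vecs[best_i] ends at time (dim - 1) * tau + best_i; its successor is one step later
    return x[(dim - 1) * tau + best_i + 1]
```

With `dim=2`, each delay vector is just the current and previous observations, i.e., the kind of deliberately incomplete two-dimensional reconstruction that the paper shows can still forecast well.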
AUG 2015


PLoS One

In online social media networks, individuals often have hundreds or even thousands of connections, which link these users not only to friends, associates, and colleagues, but also to news outlets, celebrities, and organizations. In these complex social networks, a ‘community’ as studied in the social network literature, can have very different meaning depending on the property of the network under study. Taking into account the multifaceted nature of these networks, we claim that community detection in online social networks should also be multifaceted in order to capture all of the different and valuable viewpoints of ‘community.’ In this paper we focus on three types of communities beyond follower-based structural communities: activity-based, topic-based, and interaction-based. We analyze a Twitter dataset using three different weightings of the structural network meant to highlight these three community types, and then infer the communities associated with these weightings. We show that interesting insights can be obtained about the complex community structure present in social networks by studying when and how these four community types give rise to similar as well as completely distinct community structure.

Journal Paper D. Darmon, E. Omodei, J. Garland
MAR 2015


PLoS One

Electrical communication between cardiomyocytes can be perturbed during arrhythmia, but these perturbations are not captured by conventional electrocardiographic metrics. We developed a theoretical framework to quantify electrical communication using information theory metrics in two-dimensional cell lattice models of cardiac excitation propagation. The time series generated by each cell was coarse-grained to 1 when excited or 0 when resting. The Shannon entropy for each cell was calculated from the time series during four clinically important heart rhythms: normal heartbeat, anatomical reentry, spiral reentry and multiple reentry. We also used mutual information to perform spatial profiling of communication during these cardiac arrhythmias. We found that information sharing between cells was spatially heterogeneous. In addition, cardiac arrhythmia significantly impacted information sharing within the heart. Entropy localized the path of the drifting core of spiral reentry, which could be an optimal target of therapeutic ablation. We conclude that information theory metrics can quantitatively assess electrical communication among cardiomyocytes. The traditional concept of the heart as a functional syncytium sharing electrical information cannot predict altered entropy and information sharing during complex arrhythmia. Information theory metrics may find clinical application in the identification of rhythm-specific treatments which are currently unmet by traditional electrocardiographic techniques.

Journal Paper H. Ashikaga, J. Aguilar-Rodríguez, S. Gorsky, E. Lusczek, F.M.D. Marquitti, B. Thompson, D. Wu, J. Garland
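The coarse-graining and information measures in the abstract above reduce to a few standard definitions. The sketch below is illustrative only (the helper names are hypothetical and the thresholding is simplified), not the analysis code used in the study:

```python
import math
from collections import Counter

def coarse_grain(samples, threshold=0.0):
    """Map each sample to 1 (excited) or 0 (resting) by thresholding."""
    return [1 if s >= threshold else 0 for s in samples]

def shannon_entropy(symbols):
    """Shannon entropy, in bits, of a sequence of symbols."""
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in Counter(symbols).values())

def mutual_information(a, b):
    """I(A; B) = H(A) + H(B) - H(A, B), in bits, for paired sequences."""
    return shannon_entropy(a) + shannon_entropy(b) - shannon_entropy(list(zip(a, b)))
```

Applied cell by cell, the entropy profiles local activity, while the mutual information between pairs of cells quantifies the shared electrical information discussed above.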
APR 2015


PLoS One

In space mission trajectory planning in dynamic environments, such as at asteroids, scenarios leading to failure must be discovered. Given an initial state of a spacecraft about an asteroid, failure can be quantified simply as impact of the vehicle with the asteroid or escape of the vehicle from the asteroid. For mission planning and execution purposes it is necessary to perform maneuvers to avoid such outcomes; however, the overall set of possible maneuvers that avoid either impact or escape can be a complex set that cannot be characterized analytically. This paper introduces a reachable set explorer (RSE) for exploring the reachable set of a spacecraft: the set of trajectories under a range of ΔV expenditures. This approach is applied to the circular restricted 3-body problem (CR3BP), where a brute-force approach is intractable. RSE focuses on the boundaries between impact, escape, and in-system regions, known as the end-result regions.

Journal Paper E. Komendera, J. Garland, E. Bradley, D. Scheeres
NOV 2014


Physical Review E

This paper provides insight into when, why, and how forecast strategies fail when they are applied to complicated time series. We conjecture that the inherent complexity of real-world time-series data, which results from the dimension, nonlinearity, and nonstationarity of the generating process, as well as from measurement issues such as noise, aggregation, and finite data length, is both empirically quantifiable and directly correlated with predictability. In particular, we argue that redundancy is an effective way to measure complexity and predictive structure in an experimental time series and that weighted permutation entropy is an effective way to estimate that redundancy. To validate these conjectures, we study 120 different time-series data sets. For each time series, we construct predictions using a wide variety of forecast models, then compare the accuracy of the predictions with the permutation entropy of that time series. We use the results to develop a model-free heuristic that can help practitioners recognize when a particular prediction method is not well matched to the task at hand: that is, when the time series has more predictive structure than that method can capture and exploit.

Journal Paper J. Garland, R.G. James, E. Bradley
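Weighted permutation entropy, the redundancy estimator used in the paper above, is straightforward to sketch: count the ordinal patterns of short windows, but weight each occurrence by the window's variance so that large-amplitude structure counts more than flat noise. A minimal illustrative version (following the Fadlallah et al. weighting; not the authors' implementation):

```python
import math
from collections import defaultdict

def weighted_permutation_entropy(x, order=3, delay=1):
    """Weighted permutation entropy of a scalar series, normalized to [0, 1]."""
    pattern_weight = defaultdict(float)
    total = 0.0
    for t in range(len(x) - (order - 1) * delay):
        window = [x[t + i * delay] for i in range(order)]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        mean = sum(window) / order
        weight = sum((v - mean) ** 2 for v in window) / order  # window variance
        pattern_weight[pattern] += weight
        total += weight
    if total == 0.0:
        return 0.0  # constant signal: no variability, no new information
    h = -sum((w / total) * math.log2(w / total)
             for w in pattern_weight.values() if w > 0)
    return h / math.log2(math.factorial(order))
```

Values near 0 indicate a highly redundant, predictable series; values near 1 indicate that each new observation is essentially unpredictable from the recent past.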
6 DEC 2007



Delay-coordinate embedding is a powerful, time-tested mathematical framework for reconstructing the dynamics of a system from a series of scalar observations. Most of the associated theory and heuristics are overly stringent for real-world data, however, and real-time use is out of the question due to the expert human intuition needed to use these heuristics correctly. The approach outlined in this thesis represents a paradigm shift away from that traditional approach. I argue that perfect reconstructions are not only unnecessary for the purposes of delay-coordinate based forecasting, but that they can often be less effective than reduced-order versions of those same models. I demonstrate this using a range of low- and high-dimensional dynamical systems, showing that forecast models that employ imperfect reconstructions of the dynamics---i.e., models that are not necessarily true embeddings---can produce surprisingly accurate predictions of the future state of these systems. I develop a theoretical framework for understanding why this is so. This framework, which combines information theory and computational topology, also allows one to quantify the amount of predictive structure in a given time series, and even to choose which forecast method will be the most effective for those data.

Theses Selected J. Garland
MAR 2012


Chaos: An Interdisciplinary Journal of Nonlinear Science

We investigate the use of iterated function system (IFS) models for data analysis. An IFS is a discrete-time dynamical system in which each time step corresponds to the application of one of a finite collection of maps. The maps, which represent distinct dynamical regimes, may be selected deterministically or stochastically. Given a time series from an IFS, our algorithm detects the sequence of regime switches under the assumption that each map is continuous. This method is tested on a simple example and an experimental computer performance data set. This methodology has a wide range of potential uses: from change-point detection in time-series data to the field of digital communications.

Journal Paper Z. Alexander, J.D. Meiss, E. Bradley, J. Garland
Jan 2019


Ecological Monographs

Successfully predicting the future states of systems that are complex, stochastic and potentially chaotic is a major challenge. Model forecasting error (FE) is the usual measure of success; however model predictions provide no insights into the potential for improvement. In short, the realized predictability of a specific model is uninformative about whether the system is inherently predictable or whether the chosen model is a poor match for the system and our observations thereof. Ideally, model proficiency would be judged with respect to the system’s intrinsic predictability – the highest achievable predictability given the degree to which system dynamics are the result of deterministic vs. stochastic processes. Intrinsic predictability may be quantified with permutation entropy (PE), a model-free, information-theoretic measure of the complexity of a time series. By means of simulations we show that a correlation exists between estimated PE and FE and show how stochasticity, process error, and chaotic dynamics affect the relationship. This relationship is verified for a dataset of 461 empirical ecological time series. We show how deviations from the expected PE-FE relationship are related to covariates of data quality and the nonlinearity of ecological dynamics. These results demonstrate a theoretically grounded basis for a model-free evaluation of a system’s intrinsic predictability. Identifying the gap between the intrinsic and realized predictability of time series will enable researchers to understand whether forecasting proficiency is limited by the quality and quantity of their data or the ability of the chosen forecasting model to explain the data. Intrinsic predictability also provides a model-free baseline of forecasting proficiency against which modeling efforts can be evaluated.

Journal Paper F. Pennekamp, A. C. Iles, J. Garland, U. Brose, G. Brennan, U Gaedke, U. Jacob, P. Kratina, B. Matthews, S. Munch, M. Novak, G. Palamara, B. Rall, B. Rosenbaum, A. Tabi, C. Ward, R. Williams, H. Ye, and O. Petchey
JUNE 2018


Proceedings of the Royal Society B

Animal social groups are complex systems that are likely to exhibit tipping points—which are defined as drastic shifts in the dynamics of systems that arise from small changes in environmental conditions—yet this concept has not been carefully applied to these systems. Here we summarize the concepts behind tipping points and describe instances in which they are likely to occur in animal societies. We also offer ways in which the study of social tipping points can open up new lines of inquiry in behavioral ecology and generate novel questions, methods, and approaches in animal behavior and other fields, including community and ecosystem ecology. While some behaviors of living systems are hard to predict, we argue that probing tipping points across animal societies and across tiers of biological organization—populations, communities, ecosystems—may help to reveal principles that transcend traditional disciplinary boundaries.

Journal Paper J. N. Pruitt, A. Berdahl, C. Riehl, N. Pinter-Wollman, H. V. Moeller, E. G. Pringle, L. M. Aplin, E. J. H. Robinson, J. Grilli, P. Yeh, V. M. Savage, M. H. Price, J. Garland, I. C. Gilby, G. N. Doering, E. Hobson
JUNE 2018


Submitted to Ecology

Tipping points are defined as drastic shifts in the state of systems resulting from small changes in environmental conditions. We argue that predicting tipping points in nature will require leaving the convenience of simplified laboratory systems to consider cases where multiple complex systems interact both within and across levels of biological organization. We refer to the dynamics of interacting tipping points as ‘coupled tipping points’. Understanding interactions between the states of complex systems is likely essential for predicting how they will respond to changing environmental conditions. However, the prevalence and importance of coupled system dynamics in nature is not currently well resolved. Here we use animal social groups and their interactions with higher and lower organizational levels to help elucidate the potential importance of coupled tipping points, and how they can be explored. We outline i) how the dynamics of interacting complex systems can be mathematically coupled together, ii) the levels of organization at which tipping points are known to occur, and iii) the axes of variation that can be used to compare and contrast the properties of coupled tipping points. We conclude by outlining some of the challenges facing investigations into these topics and some potential solutions. By accounting for interactions between complex systems, the study of tipping points promises to become a more integrative and predictive science.

Preprint J. N. Pruitt*, J. Garland*, H. V. Moeller, A. Berdahl, N. Pinter-Wollman, P. Yeh, V. Savage, K. Demes, A. Little, D. Fisher, M. H. Price, D. Feldman, E. Robinson, E. Hobson
JUNE 2018


Submitted to PLoS One

Paleoclimate records are extremely rich sources of information about the past history of the Earth system. Information theory, the branch of mathematics capable of quantifying the degree to which the present is informed by the past, provides a new means for studying these records. Here, we demonstrate that estimates of the Shannon entropy rate of the water-isotope data from the West Antarctica Ice Sheet (WAIS) Divide ice core, calculated using weighted permutation entropy (WPE), can bring out valuable new information from this record. We find that WPE correlates with accumulation, reveals possible signatures of geothermal heating at the base of the core, and clearly brings out laboratory and data-processing effects that are difficult to see in the raw data. For example, the signatures of Dansgaard-Oeschger events in the information record are small, suggesting that these abrupt warming events may not represent significant changes in the climate system dynamics. While the potential power of information theory in paleoclimatology problems is significant, the associated methods require careful handling and well-dated, high-resolution data. The WAIS Divide ice core is the first such record that can support this kind of analysis. As more high-resolution records become available, information theory will likely become a common forensic tool in climate science.

Preprint J. Garland, T.R. Jones, E. Bradley, M. Neuder, J.W.C. White
DEC 2018



Permutation entropy techniques can be useful for identifying anomalies in paleoclimate data records, including noise, outliers, and post-processing issues. We demonstrate this using weighted and unweighted permutation entropy with water-isotope records containing data from a deep polar ice core. In one region of these isotope records, our previous calculations (See Garland et al. 2018) revealed an abrupt change in the complexity of the traces: specifically, in the amount of new information that appeared at every time step. We conjectured that this effect was due to noise introduced by an older laboratory instrument. In this paper, we validate that conjecture by reanalyzing a section of the ice core using a more advanced version of the laboratory instrument. The anomalous noise levels are absent from the permutation entropy traces of the new data. In other sections of the core, we show that permutation entropy techniques can be used to identify anomalies in the data that are not associated with climatic or glaciological processes, but rather effects occurring during field work, laboratory analysis, or data post-processing. These examples make it clear that permutation entropy is a useful forensic tool for identifying sections of data that require targeted reanalysis—and can even be useful for guiding that analysis.

Journal Paper J. Garland*, T. R. Jones*, M. Neuder, V. Morris, J. W. C. White, E. Bradley
DEC 2018



Explaining how and why some species evolved to have more complex social structures than others has been a long-term goal for many researchers in animal behavior because it would provide important insight into the links between evolution and ecology, sociality, and cognition. However, despite long-standing interest, the evolution of social complexity is still poorly understood. This may be due in part to a focus more on the feasibility of quantifying aspects of sociality and less consideration for what features of complexity these measures can detect. Any given approach to studying complexity can tell us some things about animal sociality, but may miss others, so it is critical to decide first how to approach conceptualizing complexity before jumping in to quantifying it. Here, we highlight four fundamental concepts that are commonly used in the field of complex systems: (1) micro and macro scales of organization, (2) compression and coarse-graining, (3) emergent macro-level patterns, and (4) downward causation from macro to micro scales. These are particularly applicable to animal social systems, but are not often explicitly addressed. We summarize these concepts and then show how they can provide a rigorous foundation for conceptualizing social complexity. Using these concepts, we re-examine several existing animal social complexity approaches and show how each relates to these fundamental concepts. Ultimately, researchers need to critically evaluate any measure of animal social complexity in order to balance the biological relevance of the aspect of sociality they are quantifying with the feasibility of obtaining enough data.

Preprint E. Hobson, V. Ferdinand, A. Kolchinsky, J. Garland


Cobweb Diagram


Logistic Map: Cobweb Plots and the Time Domain

For a walkthrough of this app, as well as exercises, please see this worksheet.

The “Initial Condition (x0)” text field sets the point from which the trajectory begins; this input is only well defined between 0 and 1. The “Parameter (r)” text field sets the logistic map’s r parameter. Alternatively, you can change r by clicking on the cobweb plot: the top of the parabola will be moved to the point you clicked, effectively adjusting the r parameter. The currently selected r appears in the cobweb plot’s legend, as well as in the “Parameter (r)” text field. The “Number of Initial Iterates” field sets how many iterates of the logistic map are plotted initially, i.e., how long the initial trajectory is.
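Under the hood, the app iterates the logistic map x_{k+1} = r x_k (1 - x_k) and draws the resulting cobweb. A minimal sketch of that iteration and of the segments a cobweb plot traces (function names are illustrative, not the app's source):

```python
def logistic_trajectory(x0, r, n):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k) for n steps."""
    traj = [x0]
    for _ in range(n):
        traj.append(r * traj[-1] * (1.0 - traj[-1]))
    return traj

def cobweb_segments(traj):
    """Line segments a cobweb plot draws: a vertical move to the parabola,
    then a horizontal move back to the diagonal y = x."""
    segments = []
    for a, b in zip(traj, traj[1:]):
        segments.append(((a, a), (a, b)))  # vertical: (x, x) -> (x, f(x))
        segments.append(((a, b), (b, b)))  # horizontal: (x, f(x)) -> (f(x), f(x))
    return segments
```

With r = 2, every orbit settles onto the fixed point x* = 1 - 1/r; pushing r toward 4 (e.g., by clicking near the top of the plot) produces the chaotic trajectories the exercises explore.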


Cobweb and Time Series Control Panel
Bifurcation Diagram


Bifurcation Plot Options and Control


Exploration of Map Dynamics with Bifurcation Diagrams

For a walkthrough of this app, as well as exercises, please see this worksheet.
