# Getting and Setting Data

A lot of functional programming revolves around functions, but getting and setting data is just as crucial a part of programming. It's also how you easily violate pure function rules: by mutating data or by triggering null pointer exceptions.
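
To make that concrete, here's a minimal sketch showing both failure modes; the `user` record and the parsed JSON are hypothetical:

```typescript
// A hypothetical user record for illustration.
const user = { name: 'Jesse', address: { city: 'Atlanta' } };

// Setting by mutation: the original object changes in place, so any
// other code holding a reference to `user` now sees the new value.
user.address.city = 'Marietta';

// Getting without a safety check: this JSON has no `address`, so
// reading `.city` off `undefined` throws at runtime.
const parsed = JSON.parse('{"name": "Jesse"}');
console.log(parsed.address.city); // TypeError: Cannot read properties of undefined
```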

Herein we'll cover the normal ways of getting and setting data, and why they aren't pure. We'll then cover the various safe and terse ways we can get data purely. Finally, we'll cover some basic getting and setting of data using lenses, via Ramda and Focused, and Lenses for Python.

## I Use Types, Thus Lenses Are An Anti-Pattern

If you use types, you may think lenses are pointless. Why would you get null pointers if everything is typed correctly? I encourage you to learn about lenses anyway, for three reasons (their more advanced usages, such as in Haskell, are outside the scope of this book).

### Limitations on Types

If you've used a typed language for a while, you know you can still get runtime exceptions around null pointers. Sadly, various type systems have limits, such as safely getting items out of an Array or ensuring a Number is within a particular range. When parsing data from unknown sources, types can break down, so some typed languages provide super strict parsers, like Elm's parser and JSON decoders. These ensure no untyped data can get in, and they're why lenses aren't used in Elm. Typed languages have some ways of ensuring these problems can't happen, but most still fall short of the claim "prove I won't get runtime exceptions, ever!". Languages like Elm and Purr can make that claim.

However, for larger data sets, adding parsers can be a lot of work, and if the data changes in any way, you have to redo or modify much of what you did. It's neither fun nor fast.
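
Here's a minimal TypeScript sketch of that kind of limit: the Array index type-checks fine, but the value isn't there at runtime.

```typescript
// Even with strict types, indexing past the end of an Array compiles fine.
const scores: number[] = [90, 85, 77];

// TypeScript types this as `number`, but at runtime it's `undefined`
// (the `noUncheckedIndexedAccess` flag surfaces this, but it's off by default).
const fourth = scores[3];

// This line type-checks, yet throws at runtime.
console.log(fourth.toFixed(1)); // TypeError: Cannot read properties of undefined
```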

### You Don't Always Control Your Data

Parsers won't save you when you don't know the exact structure of the data; for example, when doing audit logging with Kafka. You'll take events from various places: JSON messages created by various teams at various points in their projects. These come from thousands of different applications built over the years, which may or may not match the exact JSON format you're expecting, with all fields present, intact, with reasonable values and expected types. Then you have to process thousands of these a second, schema or not. The same challenge exists in data science, with data you don't know and have to clean.
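
To illustrate, here's a sketch using Ramda's `pathOr`; the event shapes are invented, but they mimic the kind of variation you'd see across teams:

```typescript
import * as R from 'ramda';

// Hypothetical audit events from different teams; shapes vary, and
// fields may be missing, null, or not what you expect.
const events: any[] = [
  { user: { id: 42 }, action: 'login' },
  { action: 'logout' },            // no user at all
  { user: null, action: 'login' }, // user is null
];

// pathOr walks the path safely and falls back to a default value,
// so malformed events can't cause null pointer exceptions.
const userIds = events.map(evt => R.pathOr('unknown', ['user', 'id'], evt));
console.log(userIds); // [42, 'unknown', 'unknown']
```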

Lenses have the advantage of being able to deal with unknown data safely, in an immutable way. Types help, no doubt. However, if you're using dynamic languages, or large data sets, possibly ones that change or don't have a consistent structure, lenses can help you keep purity, with less code than you'd write using list comprehensions.
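
As a preview, here's a minimal sketch using Ramda lenses (the record shape is hypothetical): viewing a missing path yields `undefined` instead of throwing, and setting returns a new object rather than mutating the original.

```typescript
import * as R from 'ramda';

// A lens focused on a nested path.
const cityLens = R.lensPath(['address', 'city']);

const user = { name: 'Jesse', address: { city: 'Atlanta' } };

// Getting: view returns undefined instead of throwing when the path is missing.
console.log(R.view(cityLens, user));           // 'Atlanta'
console.log(R.view(cityLens, { name: 'Bo' })); // undefined, no exception

// Setting: set returns a new object; the original is untouched.
const moved = R.set(cityLens, 'Marietta', user);
console.log(moved.address.city); // 'Marietta'
console.log(user.address.city);  // 'Atlanta' (unchanged)
```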