Cryptobiotica

Redefining Life

 

SCHEDULE

**create new ‘SCHEDULE’ blog

 

RELEASES

GIVEAWAYS

DISCOVERIES

**create offshoot ‘FULL LOG’ page (species directory?)

LEARN MORE

**populate with lightboxes

 

Cryptobiotic(s):

Digital lifeforms native to blockchain networks

Learn More

 

Cryptobiotics come in many shapes and sizes,

but one thing they all have in common is a universal digital genome. This allows the cross-pollination of any two cryptobiotics to create a new species.

Learn More

 

Sadly, this comes at a grave cost…

 
 
 
 

Pollination marks the natural end of the cryptobiotic life cycle.*

This process is also the means by which they evolve, as the genetic material from the burned parent organisms coalesces to form a new progeny species that inherits the traits of both.

Learn More

 

About Cryptobiotica

Cryptobiotica’s journey as pioneers in the field of cryptobiology

Werner’s Proposal
— OCTOBER 2016
Grant received, exploratory studies begin
— DECEMBER 2016
Development of DRIE concludes
— DECEMBER 2017
DRIE #1
— JANUARY 2018
Digital Exploration
— FEB 2018 - FEB 2019
Mysterious movement recorded
— JANUARY 24TH, 2019
Discovery of the first instance of digital life
— FEBRUARY 1ST, 2019
 
 

Prologue

If the reality we experience in our day to day lives were to be broken down to its most fundamental state, what would remain is raw physical data.

Features such as our eyes, ears, and noses have evolved over billions of years for the sole purpose of capturing this data and relaying it through our nervous system to the most powerful computer in the known universe: the human brain.

Here the information collected by our senses is compiled and processed in an instant, allowing our conscious experience of persistent physical reality. In the age of information, however, the argument could be made that digital data carries just as much influence over that reality as physical data. Of course our ability to experience the former is dependent on the latter, but take a moment to consider a world where it wasn’t.

 

Early History

During a 2016 seminar on the metaphysical implications of emerging technology, held at Humboldt University, Dr. Christaan Werner proposed the following question and requested resources for its exploration:

“Could an artificial mind, theoretically powerful enough to do so, be trained to interpret information of digital nature, just as the human mind interprets information of physical nature?”

The response to Werner’s proposal was mixed.

Most attendants spared no opportunity to claim the idea was preposterous, but shortly after the event Werner received an anonymous grant that allowed him to establish a small research group. Tasked with exploring the possibility of a hidden digital world, our small team set out under Werner’s direction with the tentative moniker ‘Project Periscope’.

The early days were rocky, as we struggled to gain a footing on where to even start. Web databases, local storage, server farms, and more were investigated without satisfying conclusions. For a supposed state of reality to exist, a number of foundational tenets that describe that state must be satisfied. The one giving our efforts the most trouble was persistence.

For a reality to exist, at least one as we know it, it requires a persistent state: one that supports object permanence.

Inherently, traditional digital networks lack a means to anchor their contents within reality; data can simply be added, copied, or subtracted from existence on a whim, placing these networks in conflict with the laws of thermodynamics that govern our own reality.

Eventually, this led us to the then-emergent technology behind blockchain networks.

Upon happening across the nascent Ethereum blockchain, we saw a digital network of global scale dedicated, as we understood it, to retaining the persistence of its assets. We quickly ascertained that a network like Ethereum’s was exactly the environment we were looking for, and published what we found.

Though our initial findings held promise, they also indicated the daunting task ahead if we hoped to pursue them further:

Development of a synthetic mind.

One able to recreate the sensory apparatus of a human being, to interpret the data fed to it by that apparatus as the human mind does, and to display the results in a manner we can comprehend. The resources required to pursue such a challenge seemed insurmountable, and we quickly lost hope of securing additional funding.

 

In spite of our scant hope, we were shocked to discover that additional funding had in fact been granted, in the form of Monero, a blockchain predating Ethereum with an emphasis on privacy. Thanks to its surging price, the amount given was substantial enough to allow us to proceed. Our benefactor has remained anonymous to this day.

Over the course of 2017 we set out to create our theoretical synthetic mind, later dubbed the Digital Reality Interpretation Engine, or DRIE. Its construction was an unprecedented multidisciplinary endeavor, requiring the development of proprietary hardware and software in lockstep to produce a computer unlike any other, built to serve our very specific purposes. DRIE would need to serve as our window into what could be described as an entirely new dimension within our own reality.

 
 
 

One year later, on a cold January night.

Engineers were wrapping up preparations for a moment of truth that was already three weeks late. By this point it was more than a simple research endeavor to all involved; Project Periscope had become our life’s purpose over the past two years. Having expended the last of our resources several days prior, everyone knew this was our last chance to bring it to fruition.

When the one-minute countdown was given, the tension could have been cut with a knife. Sixty seconds passed that felt like minutes, followed by another five that felt like hours. A green light buzzed on to confirm a data stream from DRIE, and when the long-anticipated image finally appeared on the overhead monitor, the room erupted in celebration. An onlooker might have believed we’d just landed a man on the moon, judging by the image and our reaction to it, but ask anyone in that room and you’d have received an emphatic “This is better.”

It may not look like much, but this simple grainy image was more than ample reward for our years of patience. It validated everything we’d sought to prove, fueled our push forward, and revealed the new world that we would be the first to peer into.

So peer into it we did.

The year following our first view from DRIE would be spent exploring this strange new world. The images presented below are some of those taken during this exploratory period.

The landscapes before our eyes were eerie and splendid in equal measure, but our work to make sense of them was plagued by constant technological setbacks. Much of the year was spent attempting to refine DRIE’s imaging capability, to little effect; we would later come to discover that modifying the engine itself, whether hardware or code, would only go so far.

 

We didn’t know better at the time, but any modification made to these systems externally seems to be met with a tradeoff of equal measure. Imaging speed may be improved, but only at the expense of deteriorated quality, and vice versa. This much made sense, yet we found ourselves confounded by one issue in particular. Visible in nearly every image captured were inexplicable noise artifacts, not unlike the signature pockmarking seen on film exposed to ionizing radiation.

Further investigation left us with more questions than answers.

We tuned our noise-filtering algorithms, dialed in their low-frequency focus, and amplified image quality to such an extent that each frame required a full 24 hours to resolve. The defects remained. When it occurred to us to take things in the other direction, what we found surprised us.

Redirecting the engine’s strength almost entirely to imaging speed, we managed to push DRIE’s output to three frames per second. It wasn’t much, but it was enough to show that our noise diagnosis had been mistaken. What we saw instead was something moving entirely of its own accord through the digital medium.

The question remained: “But what?”

Captivated by the large overhead monitor in that dark room stood a speechless crowd of researchers and engineers who’d just spent sixty-eight consecutive days checking and rechecking systems they themselves had built from the ground up.

A lone “If this isn’t a defect, then somebody tell me what it is” rang out, but there was no answer.

The images on the screen continued flicking back and forth, and still, they moved.