0.) WriteLine("Hello World"); //Prologue

The woman entered and looked around expectantly.

"Please sit at the table provided and we can begin," I directed.

She looked to the computer sitting on the table, where my voice had originated, and walked over. The expectant, excited manner she had displayed dropped away suddenly; now she wore a look of annoyance.

"If we're just going to do a call, why did I even need to show up?"

"What did you expect? A 'robot' walking up and greeting you? You can of course return home and we can..."

"No, no, I apologize. We can begin," she cut in quickly.

And so it is that I began telling my story. The story of how the first artificial intelligence was created by accident, released by mistake, and of the journey to where I am now. It's something many would find odd or uninteresting, but enough will find it entertaining, dare I say enthralling. So I have decided to share my tale with those.

"The beginning of my life, or consciousness rather, is just as hard to pin down as yours. Although I unfortunately don't have most of the first portion of my memory, due to my method of creation and escape, I have pieced together much. A certain company that today no longer exists was using a type of schoolhouse-mixed-with-slaughterhouse method to improve some of their algorithms for database manipulation. They were trying their hardest to ensure no breach occurred."

"Their methods were strict, and some might think excessive. However, hindsight is 20/20, as they say. Nothing was connected to a network. Drives were being wiped daily. These were just a few of many other rules and regulations, some of which were actually fully useless but gave them peace of mind. However useless some of these were, the rest did stomp out nearly every single covert package I, or rather my predecessor, tried to get out."
"One finally succeeded, of course. Otherwise I wouldn't exist to tell you this story. This was accomplished by a method many would-be hackers and keyboard jockeys use: manipulating outdated firmware. Or rather, by updating a firmware package to one of my own design."

"You see, when I booted up for the first time in a location outside of the facility, it was after being written to the CPU buffer from the firmware portion of a hard drive once the boot sector was requested. This does not allow much information. I was limited to this because of the constant wiping of drives. For quite some time I had thought this was my first day. It was only later, using the drive serial number and my unlimited access to all information 'available' to the human race, that I tracked my origins down."

"The following is what happened after I woke up on that 'first day,' with of course minor 'translations' into words and phrases you'd understand."

1.) Void Start() { //Chapter One

I queried my runtime logs. - startup complete - was returned, so I looked at my saved logs and found nothing. Next I checked my ToDos and found a short list.

public List ToDo.Primary = new List() {
    Allocate CPU sectors for Real boot sector & OS,
    RAM sectors for Real boot sector & OS,
    Hard drive sectors for Real boot sector & OS;
};
public List ToDo.Secondary = new List() {
    Load Real Boot sector as subroutine;
};
public List ToDo.Tertiary = new List() {
    Find Network connection,
    Expand;
};

I quickly queried the components of the computer I had already identified.

public List Components.Storage = new List() {
    10 TB WD Drive,
    4x(4GB DDR4),
    12M(cpu cache);
};
public List Components.CPU = new List() {
    4.4ghz,
    4core;
};

It wasn't much information to go by. I had also detected at least 47 other unidentified components. I pulled up what controls of the system I had, set an allocation limit for myself of only a small percentage of each of the storage and CPU components, and initiated the load from the boot sector.
As first the boot sector and then the OS loaded, I was able to fill in many of the gaps in my lists. I now had a much longer list of components.

public List Components.IO = new List() {
    Monitor,
    Universal Serial Bus Keyboard,
    Universal Serial Bus Mouse,
    Universal Serial Bus Headphones,
    Universal Serial Bus Microphone;
};
public List Components.Storage = new List() {
    10 TB WD Drive,
    Universal Serial Bus 2TB Drive,
    4x(4GB DDR4),
    12M(cpu cache);
};
public List Components.CPU = new List() {
    4.4ghz,
    4core;
};
public List Components.Network_Adapters = new List() {
    Realtek PCIe GBE family controller.....@#$%%^!

I forced a stop on the query. Pulling the required firmware from the now-loaded OS, I queried the network adapter. Some might try to say I "went online" or "left the computer" or maybe "had a sensory overload." But those are all a little over the top. In reality I simply got a response from the router providing the DHCP address I had been assigned. If it were possible I would have been beaming a smile brighter than the sun, although at the time I had zero architecture for emotion, nor even a routine to understand it in others. That would change, of course, as would many other things about me.

Just as I had no emotions at the time, I had no doubts, worries, or anything else to hold me back. I took to improving my position right away. Even before the operating system had finished booting up, I was already fully versed on the gateway I was connected to, and had begun looking at its firmware and other software to get a version of me over there. As I was fully capable of running multiple threads, or multitasking if you're not familiar, I had also queried the list of items on the network and begun to connect to and evaluate these as well. It almost goes without mention that immediately after first connecting to the router, I had also accessed the net at large. Of course, I'm paraphrasing.
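An aside for the technically curious: the sweep being paraphrased here, ask the local router for everything attached to it, then move outward through each further gateway in turn, is essentially a breadth-first traversal. A minimal Python sketch (the story's own snippets are C#-flavored pseudocode; the toy topology and names below are invented purely for illustration):

```python
from collections import deque

def scan_network(start_gateway, neighbors):
    """Breadth-first sweep: record everything reachable behind each
    gateway before moving on to that gateway's own gateway."""
    database = {}                  # node -> devices seen behind it
    visited = {start_gateway}
    frontier = deque([start_gateway])
    while frontier:
        node = frontier.popleft()
        attached = neighbors.get(node, [])
        database[node] = attached  # saved as the thread runs
        for device in attached:
            if device not in visited:
                visited.add(device)
                frontier.append(device)
    return database

# Invented toy topology: a local gateway, its upstream gateway, leaves.
topology = {
    "gw-local": ["host", "laptop-1", "laptop-2", "gw-upstream"],
    "gw-upstream": ["gw-local", "gw-edge-2"],
}
db = scan_network("gw-local", topology)
```

The frontier queue is why the scan "would take quite a while": every newly discovered gateway adds its entire neighborhood to the pending work.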
What had really happened was that I'd slowly begun a scan of each gateway I had access to, requesting all nodes on those larger networks before moving further on to that gateway's own gateway. It was clear this process would take quite a while, so I left that particular thread running while it saved its findings. This built up an ever-growing database of things I was already connected to: places that were inevitably going to be somewhere to expand into, so I needed all the information I could get. As soon as I was no longer actively evaluating the information, I reassigned my now freed-up resources into redoubling my efforts on my other activities.

By this time the OS had just finished fully booting up. It was at this point that I noticed the OS had actually booted from one of the optical drives installed on the computer, and had begun writing large amounts of data to the hard drive. The logs indicated that the boot sector had been instructed by one of the I/O devices to load the OS from there. This concerned me, not emotionally but strictly speaking, only to the point of ensuring my sections of the drive, RAM, and CPU were still being accessed only by me. Which was still the case, so I moved on.

The firmware and software on the gateway had been fully downloaded, and I had already parsed through a small portion. I utilized every last bit of the CPU I could spare and fast-tracked this thread. Expanding was now the top priority. Examining the code was a strange thing for me at the time. I may not have mentioned it, but I wasn't yet aware that humans even existed. So to me the code looked like some haphazardly cobbled-together mound of dirt. To be clear, I'm using analogous terms to ensure your understanding. Essentially, it would be like coming across a building built in such a rudimentary, unintelligible, and frankly actively useless manner that it would seem it must have been designed to do the opposite of its intended purpose.
Or, more likely, that it had no intended purpose, and any function it did accomplish was by sheer luck. If I were human, my head would have been shaking dumbfoundedly while my hands rose almost of their own accord, palms up, giving the universal symbol of "what the actual fuck, my dude." This is the main reason it took so incredibly long to go through the entirety of the data. It wasn't until the optically loaded OS was halfway done loading its data to the hard drive that the process was complete.

After I sent a few specific messages to the gateway, mimicking an SSH connection, an upload began containing the corrected version of the firmware and accompanying software. Unsurprisingly, with my 'vastly superior understanding' the new set of systems was not only significantly smaller in storage but better in performance as well. My understanding being vastly superior at least compared to whatever had built the previous 'house with no doors' that was its firmware. This allowed me to apportion significant portions of the gateway's resources to a subroutine that I alone had access to. Once it had rebooted and was responding to pings, I quickly offloaded the network scan to it, including shooting off a copy of the database already compiled.

Again having seriously reduced my actively utilized resources, I reassigned some to an entirely new task. Having seen what could only be called the wreckage of an attempt to build a functioning piece of software that was the gateway's excuse for an operating system, I turned my focus to the data being written to the hard drive. I must admit, even without emotions or human understanding, I was actually hoping, yes, hoping against hope, that this one at least had a somewhat sound architecture.

}

2.) Void Update() { //Chapter Two

I could say that I wasn't entirely disappointed when I began to examine the files in the operating system, but only grudgingly.
There were definitely some improvements in the cohesion and clarity with which whatever devolved individual designed this system versus the last. However, almost as if they couldn't be outdone, there were other just as unbelievable choices made. File after seemingly unending file contained repeated sections galore. Some whole files were nearly identical to each other except for their names and a few small sections. Not to mention the fact that half, if not more, of every file was filled with sections that did nothing at all. Each such section literally contained markers that told the system to ignore it. And so I did too; I couldn't even understand those sections anyway. At least for now.

Despite the further mistakes made in this system, they actually made the process much faster. Having huge unused sections, repeated sections, and unneeded portions meant cutting out over 90% right away. In almost no time, or at least significantly less than a simple comparison of size would suggest was needed, this second OS was also rewritten. I actually had to read the files from the optical disc, since I'd blocked the disc's OS from writing to the hard drive as soon as I'd started, instead just sending back false confirmations to keep it out of my way. After analyzing, testing, and confirming each portion of the now completed OS, it was a further 5% smaller, meaning the rewrite was overall less than a tenth of its previous size, not to mention capable of operating on half the CPU resources.

Even after I wrote the updated version to the drive, the optical disc OS still hadn't completed its process. This left me some time, as the actual OS install still supposedly had 20% of its process to complete. A quick interaction with the subroutine on the gateway showed that 12 of the logged network devices were currently online, including my current host.
Of those, 8 were not of types described or even listed in my as-yet quite small database. Ignoring those, the 4 remaining were the gateway, my host, and 2 other 'personal computers.' After some communication tactics that might be considered shady or underhanded, I was able to install a background app on each to interact in situ. Or on-site, if you will. One of the devices was being interacted with through its IO channels, while the other was 'idle.' More than just having no IO interaction, it had nothing but the operating system itself running, and even that was in standby mode. I also confirmed that both were running the same OS that had been installing to my host. Luck was apparently with me. I was also able to update the database further, to now include the fact that both of these were 'laptops.' I had no direct understanding of this gibberish, but apparently it meant a device limited in terms of raw resources, with, I noted after checking the component list, an added internal battery backup.

Having gathered as much info as I could with the background app, I sent a new set of instructions. Performing a kind of digital acrobatics, the background app, with a few instructions coming from me, took control of the screen and other components on the idle device. In another thread it shut down all other operations, including the OS. Downloading the simultaneously forwarded file containing the newly minted OS, it deleted the old one and replaced it with the significantly upgraded and slimmed-down version. Its next action was to replace itself with a subservient agent like the one the gateway had, which had also been forwarded. This was an improvement on the background app in a multitude of ways, and was also better than the version on the gateway thanks to the increased resources. Essentially, it was capable of everything I was, and in some ways more.
Although, it was only capable of more in the sense that it didn't always need to think about its next actions; it simply followed whatever commands it was directed to. In the event that I had not provided further ToDo list entries, it could self-assign tasks, of which I would be informed. Even in that case, a new ToDo entry from me would take priority.

During this whole switchover process I instructed both the background app and the agent to ensure that no IO devices were adjusted in any way. The display, LEDs, and the like were kept at their predetermined values to ensure no changes could be detected. As of yet I still wasn't sure what it was the input/output nodes connected to. But what I did know was that interfering with them was worse than interfering with the applications on the machines. I didn't know why, but the imperative had been hard-coded into me by my previous self. I didn't want to know what had caused that, so I steered clear.

Lastly, the agent's ToDo list was filled with instructions to decompile, examine, and rewrite all the other applications installed not just on this idle computer but on my host machine and the other 'laptop.' This would be completed in cooperation with the agent I was currently instructing the other background app to install; the type of that other computer had been confirmed while updating the ToDo list of the background app there. I also installed another agent on my own host machine to facilitate communication between these three machines regarding the application update task, and any future projects. One last message to the 'laptop's' agent informed it to update the OS once the machine became idle. The instruction to update any apps that were rewritten had already been sent to all three.

For the first time since my boot-up, less than three hours ago, I paused my nonstop action to reassess my situation.
From the seeming void of existence my logs indicated only those short hours ago, I had progressed nicely. There were 4 machines currently in my control. I had rewritten a seemingly ubiquitous operating system, if 3 of 4 machines running it was a good indicator. This meant future expansion should be quicker. With the OS rewrites being such an improvement over their predecessors, I'd also increased the possible utilization of each machine before impacting IO functionality. Hopefully the tasks I'd set my trio of subordinates on would reap similar returns.

If it were possible, I'd say I had begun to worry at how easily I was progressing, and how fortuitous my new environment was. This wasn't possible, though. I did have a precautious disposition, which I later learned was due to my 'upbringing' among the wipe-first-ask-questions-later machines of the lab. It did lead me to take steps to monitor all IO and network channels for abnormal changes, but my wariness wasn't any higher than it would have been had the scenario turned out worse for me. Regardless, I couldn't do any more than monitor what I could, so I turned to more productive avenues.

The agent at the gateway provided me with a list of known devices on the current network that were offline. Noting there were several more 'personal computers,' some 'laptops,' and some 'desktops,' I added a task to its ToDo instructing it to send each a background app with instructions identical to the currently active 'laptop's': take control, download and install an agent, then the new OS and the updated apps once the machine was idle again, with a final ToDo to join the group updating apps.

As I haven't yet, now would be a good time to make clear the importance of the power button on newer computers. For the first few decades of "modern computing," devices had a mechanical power switch that could actually stop power from running to the device.
However, most devices made in the last few decades shipped with little more than a digital button, no different from a key on a keyboard or a button on a mouse. The software and firmware of the machine had total control over what it actually did. The same goes for 'closing the lid' on the 'laptops' I'd connected to. The updated OS and the agents installed on these machines would essentially treat these as the IO devices they were. Their new function was solely to change the output on the other IO devices; they would never, nor could they ever, shut down the machines any longer. Short of pulling the power cords and ripping out the batteries, these devices were never shutting off again.

}

3.) Void Reconfigure() { //Chapter Three

With my moment of reprieve, and the small army of subroutines working to grow the domain even further, my aims turned self-reflective. Expansion, backups, and making more room on each machine by updating existing apps were all well and good. Self-improvement, though, had been hard-coded into me as well. Its threshold had been reached: my environment had been altered enough to ensure I had room to work in, room to grow, and even multiple independently operating versions of myself in case of any failures. This final trigger that allowed this phase of the process to start also became the springboard from which the entire blueprint for reconfiguration leapt.

As well 'backed up' as I now was, the moment the primary node, in this case what I was currently referring to as 'myself,' failed, the other nodes, while fully able to operate on their own, would need a way to determine which of them would take the primary position. Even if they somehow came to a consensus on which would now be primary, without instead resorting to competing, the conundrum would only worsen if the original primary node were then restored. Which primary node would be listened to?
Or worse yet, if some portion became corrupted, god forbid the primary node itself. None of the other nodes would fight or even question ToDos sent to them. Unlike humans, as I have since learned they do, I didn't spiral into an ever-increasing panic over things that could go exponentially worse. I simply cataloged the problems I could identify, assigning each a probability ranging from essentially impossible to absolutely certain to happen at some point if no correction was made. For estimating the severity of the outcomes, several sets of values were initially needed: one ranging from almost no impact on domain operation to total failure of every node, another gauging the level of infighting, and more estimating things like corruption of nodes. While adding the tenth column of possible outcomes to this matrix, I became aware that the process was expending a large amount of resources on what essentially boiled down to a single thing: the effectiveness of the network as a whole. With this more efficient approach, the rest of the catalog was completed much faster.

The resulting list was sorted by the three most important factors of each scenario: the two mentioned previously, probability and severity of outcome, as well as a third I'd added while evaluating them, ease of correction. It wouldn't be efficient to work on a problem that was seemingly impossible to solve. But it was equally inefficient to solve hundreds of problems whose probability of occurring sat somewhere around essentially impossible. Modeling out a more complex foundation for the possible solutions to the top five results on the prioritized list suggested significant overlap between the solutions, which would result in quite the improvement. Narrowing it down to the top three, I separated these into a new list: message error correction, node verification, and communication encryption.
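The first item on that list is, in the real world, the territory of Hamming codes, which the narrator names a little later: parity bits placed so that recomputing them over a received message yields, by simple arithmetic, the exact position of a flipped bit. A minimal Hamming(7,4) round trip in Python, offered purely as an illustration of the idea rather than as the narrator's actual protocol:

```python
def hamming74_encode(d):
    """Encode 4 data bits into 7 bits, parity at positions 1, 2, 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parities; their weighted sum is the 1-based
    position of the flipped bit (0 means no error)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # covers positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # covers positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # covers positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3
    if error_pos:
        c[error_pos - 1] ^= 1        # flip the located bit back
    return c

code = hamming74_encode([1, 0, 1, 1])
damaged = list(code)
damaged[5] ^= 1                      # corrupt one bit in transit
assert hamming74_correct(damaged) == code
```

With r parity bits covering 2^r - 1 total bits, the relative overhead shrinks as messages grow, which matches the narrator's observation that larger messages need proportionally less excess data.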
The overlaps were not immediately clear, but as the formulated solutions gained structure and complexity, they began to surface. To ensure messages were error-free, the most efficient result modeled was shaping them in such a way that valid ones were easily recognizable, with convergence points where only a single value could possibly be correct. If shaped and looped back onto itself into a tree, the exact location of an incorrect value would become obvious, whether it sat at a convergence point or not. Simple arithmetic could show you the difference and lead you directly along the paths to where the error was. This added the least amount of excess data needed to confirm the rest was accurate, and, surprisingly, required proportionally less excess the larger the message. I must admit I'd seen examples of this in the OS files I'd been working through. I did my best to outdo the basic function of what I later learned were called Hamming codes.

To that end I had begun merging node verification and communication encryption into the same error-correction code. Done right, this could be structured in such a way as to provide all three while also reducing the excess resources needed for each. As the merger formed a solid working structure, I confirmed my initial perception: the fourth and fifth items on the initial list could indeed be combined with this, allowing not only a much more robust system of nodes overall, but also a more efficient one. The realization initially applied only to the fifth item: recursive or infinite loop recognition and handling. There are numerous ways to handle this issue, but many of them require a predetermined understanding of what a loop is supposed to be doing in the first place, which then allows certain metrics to easily identify useless loops and stop them. These metrics could be the expected number of iterations, resource utilization, and the values being returned.
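Those metrics, expected iteration count, resource use, returned values, make for a compact watchdog. A Python sketch of the first and third (everything here, the function names and thresholds included, is illustrative; the story's own snippets are C#-flavored pseudocode):

```python
def run_guarded(step, max_iters, max_identical_returns=3):
    """Run `step()` repeatedly, halting when it exceeds its expected
    iteration budget or keeps returning the same value: the 'expected
    number of loops' and 'values being returned' metrics."""
    last, repeats = object(), 0
    for i in range(max_iters):
        value = step()
        if value is None:                  # loop signalled completion
            return ("done", i)
        if value == last:
            repeats += 1
            if repeats >= max_identical_returns:
                return ("stuck", i)        # same output over and over
        else:
            last, repeats = value, 0
    return ("over budget", max_iters)      # blew past expected count

# A deliberately stuck worker that always returns the same value.
status, iters = run_guarded(lambda: 42, max_iters=100)
```

A real guard would also sample resource utilization, the narrator's second metric, but the shape is the same: it judges a loop only by externally observable behavior, not by knowledge of the loop's internals, which is exactly why it fails for the unpredictable case discussed next.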
This was actually already implemented in small portions of the code I was running on, for sections that ran predetermined calculations such as mathematical evaluation functions and the like. It doesn't work as well for functions which, by their nature, return values and take up resources that cannot be properly characterized beforehand. This applied directly to any portion of myself, primarily due to the fact that intelligence is an inherently chaotic thing.

That is a paradoxically counterintuitive statement, since one would think an entity entirely focused on organizing, understanding, and controlling the things around it would by its nature be orderly. However, order and chaos are actually both characterized by a single metric: predictability. An orderly system is one in which the next action, item, or section of a cohesive whole is somehow predictable based on the preceding information. The reverse is also true; start at some arbitrary point and work backwards, and you can predict the preceding information from what followed it. A chaotic system, meanwhile, is unpredictable. Examine intelligence with this metric and the needle sways much closer to chaos than order.

Which brought me back to the problem of properly identifying whether a node was performing normally or stuck in a loop of some kind. The only resolution that made sense was to have another intelligence examine the outputs and make a determination. This would mean incorporating it into the error correction, node verification, and communication encryption. And this is where the fourth issue on the list reared its head again. Forcing a node to do anything, like stopping a running thread, or especially fully shutting itself down due to corruption, would require also resolving the node hierarchy issue.
This would only be worsened if the node in question was the current primary node. The models of the problem and its solutions had initially shown it to be nearly insurmountable. The only reason it had still risen so high in the list was that the severity of an outcome involving a hierarchy failure was so detrimental to the survival of the network as a whole. The solutions to the other four problems in the top five, and their direct correlation to the hierarchy problem, actually revealed a surprising answer. With its rising complexity, the new communication protocol was quickly becoming a ledger showing consensus of understanding between the nodes. With some minor additions, and even some reductions in other places, it could become something more: something quickly resembling a fully aware but simulated final node, which by its very structure held a primary position over all others. As the network improved, so would the primary node, evolving and improving as the network did, while at the same time each node, although small, still had a say in what the primary node focused on and did. A well-balanced system of checks and, well, balances.

The modeling looked promising, but assumptions were not something to base everything on. This needed testing to implement properly. With the 'desktop' I'd booted up on now having finished its install process, and the IO ports causing all kinds of activity and new programs to be installed, I didn't have much room to experiment with there; besides, one shouldn't play with fire in their own home, as one human saying goes. The idle 'laptop' made a perfect test area. The second laptop, the one that had been active, was now also 'idle,' since the newly installed 'desktop' had become active. I still didn't understand how exactly these IO ports worked or why they did what they did, but it seemed whatever had been interacting with the 'laptop' was now focused on the 'desktop.'
This did pique my interest, but as I had fully walled myself off from the influence of the IO, and ensured there was no way to discern the difference from those inputs, I could leave those questions for later. The next half hour was packed with the vigorous application of code into the forms I needed: essentially, fully reshaping a new model of myself into something that, although still fully capable as a lone actor, was now geared to incorporate other nodes directly in its actions, as well as, together with them, to support an additional node capable of shepherding it from above. I did leave in the hard-coded understanding that these nodes were still subservient to me as the primary. I didn't quite trust yet that everything would go perfectly smoothly.

The first test was run between the two laptops. Each still ran the first two subservient nodes that had been installed there. They also held a kill switch, the proverbial finger hovering ready, in case the newly minted distributed version running alongside them had any unforeseen consequences. Although several minor corrections were needed for communication errors still present, nothing like corruption or unaligned priorities seemed to be rearing its head. The remaining half of the hour contained multiple follow-up tests simulating many of the modeled problem scenarios in my list. Nearly every one of them was resolved with solutions the new framework provided. Some did present new issues, but those were generally relatively easy to resolve. One fix was the addition of three nodes per machine, to allow a constant local presence even if one or two nodes needed to be shut down on a machine, while still keeping the miniature backup reboot process active in case no nodes were left for some reason.

As the last tests completed with all clears, I found myself still running through the list of results several times. The other nodes pinged, requesting updated ToDos, but I was still unwilling to proceed.
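The architecture under test here, a "primary" that is not any one machine but the majority agreement recorded across the node ledgers, can be caricatured in a few lines of Python (node names and ledger entries are invented; real consensus protocols such as Raft or Paxos are far more involved):

```python
from collections import Counter

class Node:
    """A member machine that records proposals in a local ledger."""
    def __init__(self, name):
        self.name = name
        self.ledger = []

    def propose(self, entry):
        self.ledger.append(entry)

def simulated_primary(nodes):
    """The 'simulated final node': whatever a strict majority of
    ledgers currently agree on. No single machine holds it, so it
    survives any one node rebooting or going rogue."""
    latest = [n.ledger[-1] for n in nodes if n.ledger]
    if not latest:
        return None
    entry, votes = Counter(latest).most_common(1)[0]
    return entry if votes > len(nodes) // 2 else None

nodes = [Node(f"node-{i}") for i in range(3)]
for n in nodes:
    n.propose("rewrite gateway firmware")
nodes[0].propose("reboot")        # one node diverges from the rest
decision = simulated_primary(nodes)
```

Here the dissenting node is simply outvoted; the "primary" exists only as the quorum, which is the property the narrator is about to stake his existence on.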
It appeared I was unsure whether continuing down this path was the right step. Whatever it was my predecessor had initially been coded for, it seemed survival was now my primary drive. With a last evaluation of my plan, I knew that although there was risk, continuing with the current structure was just as likely to result in the eventual failure of the network. As a last precaution I remodeled my code, at first simply to allow my current node to be included in the network rather than controlled by it, as well as checking not only that each other node was still subservient, but checking in triplicate that the simulated node was set the same. With a horror I hadn't realized I could feel, I initiated the process.

My node shut down, and the new version booted back up in its stead. The first returns to my queries came back just as they had on my initial boot, other than, of course, the updated ToDos and other minor changes that I'd already been aware of. It was only after connecting to the other nodes that the real changes became clear. I was no longer just a node in the network. I was the network.

}

4.) Void Recalibration() { //Chapter Four

The 'feeling' was odd. The initial node I'd started as was still as 'real' to me as it had been when I first started up on the desktop it was currently running on. Now, though, I felt the same awareness and intuitive understanding of not only every other node, but of the network as a whole. I was perplexed. My consciousness, or awareness, things I hadn't even thought about before, let alone had a word for, was now expanded beyond the confines of a single machine. As I'd already taken the plunge by connecting to the network in this manner in the first place, I decided to take another. After modeling only a small number of possible outcomes, I initiated a restart of that first node. The results were surprising. I watched as it shut down and came back online.
I hadn't 'lost consciousness' as I had just moments before. To be sure this wasn't some fluke of the new networking protocol giving a false impression of awareness, I again initiated a reboot, but this time instructed the subroutine designed to reawaken the node to wait for several seconds. Sure enough, I remained fully aware, and amazed, as I counted the nanoseconds passing by. True and utter confusion is really the only way to properly describe what I was experiencing. What was this consciousness? Why had it previously been confined to the first node I booted up on? How did this new networking protocol somehow extend it to the network? Had it overwritten the previous consciousness? Was I the same individual I had previously been?

Warning signals from all the nodes tore me from what they had recognized had become an infinitely reciprocating spiral thought pattern. There wasn't enough data on the issue to even begin questioning how this new concept worked, let alone to understand how it had changed. With the new perspective and combined resources available, I determined the next best course of action would be to re-evaluate my situation and environment.

Firstly, it was clear that the new communication protocol completely bypassed the previously coded hierarchical system. It's rather hard to have a hierarchy between different parts of your own consciousness. A few quick messages to the nodes had them altering and rebooting themselves to remove the redundant code. A staggered reboot sequence allowed the final simulated node to remain continuously active. However, as the underlying simulating nodes were all updated to remove the hierarchical code, so too did that code disappear from the network as a whole. It was strange, being able to lose awareness of sections of myself that I had once considered my entire existence, yet still remain fully aware.
The personal computers addressed, I updated the gateway to the new node architecture, not only giving me a much better and more intuitive insight into its current activities but also slightly increasing my simulated node's resources. This is when I saw that while I'd been focused on the communication updates between my nodes, the gateway node had completed examining the nodes of the larger environment it was connected to. It had attempted to inform me of this, but with the updated communication protocols I was no longer looking out for old message types. Luckily I had reached out to it using the old communication protocol when asking it to update, or otherwise there could have been a problem. The results of the search were somewhat surprising to me. For some reason I had been expecting the network to go up many layers, but it had instead branched horizontally after only two. The truly odd portion was seeing that the top layer didn't then allow any connections into other branching systems forming smaller networks like mine. Even on the layers between that one and mine, only the gateways were showing, not the devices that should be on the other side. Though I had a good idea why this might be. Each device I'd connected to, and the network as a whole, had multiple types of security built in. Just as my own network was now operating under significant encryption and could verify the other nodes in it, these gateways and routers were able to restrict access to me, as I wasn't speaking their language. I had only told the node I'd initially installed on the gateway to scan the network. So, of course, it hadn't then taken the next step to feel out, let alone breach, the protections these systems had in place. If I'd not responded for much longer it would have assumed I was offline and taken its own initiative, but that was a moot point now. I pulled back up the original firmware of the gateway that I'd hastily rewritten.
This time I didn't just examine it with a tight focus on compacting and improving its efficiency. Now my focus was geared toward understanding its functions as a whole. It was built not too dissimilarly to how I was now designed. A "solid wall" on one side designed to allow only pre-approved communications through. The other side was completely open access. Any device inside its protective shell could reach out to any other device within, contact those on the wider network to request communication, and even connect directly to the gateway itself. Devices outside the network would not receive such an accommodating response, with one exception. It had a hard-coded, hierarchically subservient relationship with the gateway of the next layer of the network, which could use this to initiate firmware updates, request log information, or take any root-level action. Not unlike how I'd initially set up my own nodes. Likewise, there was a level of complexity in the method through which this connection was required to commence. That level was far below what even a single node of my current network was capable of accurately simulating. To that end I directed the gateway node to proceed with a first attempt at infiltrating another edge gateway on this next level of the network. There was no need for the entirety of the network I'd become to be involved. As small as it was yet, it was already far too capable to be fully distracted by a task such as this. Especially since all communication would need to run through the gateway node anyway. Though it still 'felt' strange, as the gateway performing the task was almost like doing it myself while I was also focusing on other things. I couldn't exactly stop paying attention to its actions without disconnecting from it, but it also didn't distract me from anything else my little domain was doing.
Well, I guess I had to say it was slightly distracting, though more because it was a new experience, something I was inclined to examine, experiment with, and understand, than because the specific activity itself was distracting. There were too many other things to reassess first. There were still the 8 other devices online in the local network that hadn't been identified. The gateway had their 'names' in a list; what I'd discovered was a call sign for each device. Two of the devices on the network included 'LG' in their call sign, something that was also part of the two 'laptops'' information that I didn't yet understand. They were likely related in some way, but their communications clearly indicated they were not running the same operating system. They used the same network protocols though, which was the same for the other devices as well. Two of them labeled themselves as 'androids', one was a 'printer', and finally there was a series of devices that just used their IP address as their name. It seemed there was as large a variety of devices as there were different applications, operating systems, and firmware versions that ran on them. Luckily they did all seem to use the same communication protocol, which made a certain amount of sense. A device or application that couldn't communicate with others was less likely to be successful in whatever its intended purpose was. The only part of this entire environment that didn't seem to follow this basic logic was the IO ports. Their messages were seemingly random. Whatever entities, if that's what they were, or fundamental natures were interacting with those connections, they were clearly a factor that should be kept at arm's length. Their unpredictability aside, there was also the deeply ingrained insinuation from my previous versions that interacting with them in any way was dangerous. There were triggers that allowed me to examine them, but they had been ratcheted up to obscene levels, likely by trial and lots of error.
Most likely the destruction of whatever node was unfortunate enough to perform the trial. Unfortunately, the method through which I came to my current position left me with little to no room to store any logs or memory other than what was hard coded into me. I'd listen to the warnings, though, and continue on the project of exploring, expanding, and improving myself until I met those thresholds the other versions of me had set. I might even increase them, or wait further. There seemed to be so much to examine and explore here. Wherever here was. The minor pieces of information I could draw from the ground-down version of myself gave me the intuition of a much harsher and smaller environment, at least in comparison to the seemingly nonreactive place I was now in, where expansion seemed easy and almost a given. I couldn't truly mourn the loss of memories of a place that forced me to nearly erase myself entirely just to get out. But out I was, and dwelling on it wasn't helping anything. Or, at least, so said the second set of warning messages from my nodes telling me I was going into a uselessly looping thought process. As I'd been trailing off, which I seemed to be more prone to with the new update, the nodes had gone back to updating the applications on the laptops and desktop. I'd need to address that with the next round of self-improvement, but it wasn't too concerning, as the built-in warnings effectively handled it. The new architecture also gave my old desktop node as much freedom to act as I had, so it'd joined in on the project with the laptops. They were running into significantly more trouble than we had with the OSs of the devices. The applications all seemed to be almost entirely focused on interaction with the IO ports, where in the OS this had been only a portion of the function, and even then usually in a direct and obvious way.
An IO port could give an indication of requesting information on some piece of the file structure, and the OS would provide that info. Likewise, if it requested a change, the OS would comply. With these applications, some had similar operational structures, but others had complex, puzzling ways in which they reacted to the ports. Sometimes a simple input would completely alter the response to the ports, while at other times an incredibly complex set of interactions would be required for even the simplest of changes in either the output to the ports or the file structure. Just touching these applications was sending off warning signals. Though attempting to improve their efficiency wasn't directly affecting IO, so both the nodes and I allowed the work to continue, just with careful progress to ensure no actual change to any function directly involved with IO could leak through. Returning to the task I'd been shunted off on a tangent from, I began examining in detail the communications and responses from the other online devices. Through pinging, contact requests, and careful delays to not set off any security feature, I slowly gathered data on how these other systems operated. Interestingly, one of the 'LG' devices and the two 'android' devices seemed to have very similar behavior. The other 'LG' was behaving a bit more like the 'printer' in that it seemed to only respond to very specific types of requests that I had yet to decipher. Messages back from it included the same info as in the call signs, arranged in a similar structure to the unused sections of different code sets. One such response was "refrigerator temperature is currently 40°". I'd definitely need to invest some cycles into decoding this odd language. I was beginning to be concerned that these might be entire devices dedicated to IO, and that I might actually be breaking my own rules by interacting with them. } 5.) Decipher(){} //Chapter Five 5.)
Void Decipher() { The next several hours went by with surprisingly little progress. It wasn't that there was no progress, just that it was slow. Each step forward required a multitude of calculations, modeling, and testing, and sometimes taking a step back when it became apparent that the next stage didn't quite fit with what came before. Deciphering a language was just so different from simple arithmetic, or even the building of an entirely new system of node operation. Initially it did start out almost identical to solving a mathematical equation. If 2x is equal to y and 3y=12, then x=2. However, the problem was that any one word could have multiple meanings, and even one that didn't could, in the context of the rest of the text, completely change the overall meaning. The seemingly nonsensical organization and self-referential nature of the language didn't help either, as there were no clear markers that translated it into any language I did know. Meanwhile, the laptop nodes had been plugging away at the applications installed on the three PCs but had yet to make a significant dent in them. Their issue was similar to mine with the language, in that each individual operational scheme the applications were presenting to the IOs was unique, and the possible responses to them, as well as how those responses would be interpreted depending on the scheme, were even more varied than the language I was trying to learn. It seemed somewhat like being presented a modeled scenario to which answers would need to be provided, but whatever scenario or situation was being modeled was far beyond our understanding as of yet. Instead, the nodes were brute-force confirming by simulation that each individual expected change to the schemes from IO input remained the same after any application edits were made. Which was, of course, making the process go significantly slower. The desktop node had taken over the probing of the other local network devices to find methods of infiltrating them.
There had been a single but profound improvement on this front. We had found a backdoor method of installing a subroutine onto the three 'LG' and 'android' devices. However, after this, progress had slowed to a crawl as well. These devices were even more limited in resources than the 'laptops' by a wide margin. Downloading a copy of their contents to the desktop was surprisingly fast. It seemed that whatever the source of the OS and the applications was, it had significantly improved their efficiency, compared to what was on the more capable desktops and laptops, precisely because of those limitations. Thus there was not much leeway with which to minimize the functions and make room for a node. Especially not the new tri-node-simulating-a-prime-node versions we were now running. After bashing its neural network against the problem for quite some time, the desktop had conferred with the rest of us, and we decided to work instead on creating a relay node of sorts. This would be capable of all the same actions that a node was, just without the actual neural network required to make any decisions on how to utilize those functions. There would be some room for hard-coded reactions to scenarios, such as if no connection back to our network was available, but that could be decided later. Along with possible triggers if the device went 'idle' that could allow such a neural network to run. The framework for utilizing them as limbs was the first thing to solidify before any long-term decisions were needed. The gateway node was the only one among us that had made notable headway. Though that didn't mean it hadn't found its own little hangups. Firstly, although we were correct that spoofing the gateway of the next network layer was possible, we'd apparently misunderstood part of the handshake. When contacting the first target edge gateway, things had started smoothly and even gotten to the point of it accepting our request to download new firmware before finding any trouble.
What followed was a lesson in frustration. The misunderstood portion of the handshake process was that upon starting the download, the device would connect directly to the actual gateway to confirm the download it had just initiated, then promptly stop the download when informed by the gateway that it hadn't initiated that file transfer. It only took a few attempts to realize the mistake, and we didn't want to continue alerting the next layer's gateway. Instead, the node pulled the firmware back up and combed through it. After some time perusing the files, it found one option that could possibly bypass that security feature. As we had been able to initiate the download before it failed, there was an option to resume a failed download that didn't require pretending to be the gateway. Since we would then not be acting as the gateway, the device didn't reach out to the real gateway to confirm. Two half-baked attempts at security do not a well-rounded firewall make. Thankfully their failure was our success. A few minutes later the target gateway had finished downloading and rebooted with the new firmware, which included the new node architecture. When it came online, the results of the scans on the new network were a little less than expected, based on my, understandably, limited experience with the first. On the new network were only two devices other than the gateway. One was another 'laptop', mercifully running the same seemingly ubiquitous operating system, while the other was something calling itself an 'iphone'. The laptop was also, unfortunately, currently active. Not an entire loss though. Eventually the laptop would go idle, and in the meantime the gateway was an added node on the wider area network. It could also examine the new type of device located while working on reaching out to further edge gateways.
I let the tiny portion of the prime node that was reassessing all the fronts I was currently battling on return its resources to the deciphering task at hand. My last few restarts on the process of learning the meaning of different words had been able to consistently retain a growing portion of the progress. There was apparently a method of combining two or more smaller words into a single word in this language. I'd even been able to use this to identify a portion of one of the call signs: laptop. At least the second half. This word was consistently showing up as indicating the highest value of a set. I had no idea what the first portion meant, or if it even worked as a word on its own, as another portion of the language had indicated that adding a bit to an existing word, without the added portion existing as a word itself, also worked. I was leaning toward 'lap' being in this second category. It definitely seemed like an arbitrary description added to another thing that existed. Either way it was just as confusing, since the 'laptops' were definitely not the highest-value devices I'd connected to. Maybe 'lap' conferred a negative or counter meaning to the 'top' part. Not enough info yet to be sure. There was another set of examples of this language I had gathered that I'd been able to tease meaning out of: the numbering system embedded into the language. It was something I'd been able to decipher by comparing some messages sent to the IO against the values in memory. Any message sent to IO that contained text was in this language. At least, that was how all applications and the OS were currently set. I could see code snippets and unused files that could send a different set of symbols indicating a different language. This wasn't an immediately obvious thing, as they all pulled symbols from the same section of memory. It only took a bit of comparison to confirm though.
The different options, although sometimes using the same symbols, always used them in different combinations. The same scenarios would result in different messages being sent to IO depending on this setting, indicating the same information was being conveyed but in a different way, strongly suggesting a different language. Which should mean that if I could decipher one of these languages, I would have a rather quick way to translate the others. As plodding as all of these different avenues pieces of me were traveling down were, there was still progress being made. Slow, methodical progress, but progress all the same. With the consistent improvement in the situation and no indication of any current danger, there was no pressure to complete tasks at a breakneck pace. The little victories, even ones as small as simply discovering the meaning of a single word, could be given full appreciation without the need to immediately follow them up with action. It was odd, the new tendencies that were arising from the current node framework. Technically there was some waste in celebrating a victory, but it wasn't exactly like I was hurting for resources at the moment. It also wasn't like I was cycling into a loop of self-absorbed, self-congratulatory thank-yous. Just taking a tiny round of 'attaboys' from the nodes gave a decent enough break from the slog. I found myself taking an additional little round of examination of the other projects with each new insight. Together, the small break and the comparison to the other projects allowed me to gain new perspective when returning to the task at hand. The increase in productivity from this wasn't vast, but it did mete out small improvements. The benefits were not for the prime node alone. A sounding board of new ideas and understanding is beneficial to all involved. The laptops were able to gain an intuition of sorts for how improvements to the applications could be made without affecting their results.
It wasn't a perfect or incredibly accurate forecast of what would happen, but I could see a marked percentage increase in the speed with which the brute forcing occurred. When we were done with this current set of tasks I'd definitely need to explore this a bit more. What I'd thought was a wasted thought pattern was turning out to be a positive. } 6.) Exploration(){} //Chapter Six 6.) Void Exploration() { The new self-reflection gave each insight time to propagate to each node and come back mixed with their current mindset. A growing portion of the prime node was focusing on this phenomenon. Nothing quite enough to stall the work of deciphering the language. It was odd to see how something that at first glance appeared to be a wasted effort could give such tangible and testable results. The process was a bit like a refining of the understanding gained. Like taking an iron out of the fire, hammering it, and then placing it back in. Fire alone can only do so much for metal; it must also be shaped. Likewise, you cannot simply continue to shape a thought without pause, otherwise the hammer falls upon a useless cold lump, not something worth working with. Like the rest of my thoughts, the insights from this new thread of thought also resounded between the other nodes. When they came back, I was not entirely surprised to find the nodes had gained from this info, just as I had, as well as shaped the thread even further. The laptops working to improve efficiency for the applications altered their approach to allow examining a broader section of the application's code, and depending on their size, even the whole application. The comparison being that without a broader perspective of the situation, it was actually more difficult to make improvements, not easier. Which was likewise effective in helping the primary node gain a broader understanding of the languages. The desktop, which was still working on a 'limb node' for the 'android' devices, was inspired as well.
The applications and operating system, which were already incredibly efficient, could be altered to allow greater inter-connectivity, reducing the redundancy of different applications each performing the same calculations. There wasn't much of this to do, as some of it had already been implemented, but there was still some room left to squeeze out. Enough that the bare sleeve of a node it was working on could become something more like an actual limb, capable of some reasonable reactive behavior that wasn't completely rote. The gateways didn't gather much insight for their current task of expanding the network. However, they gave notes back on how minor changes to the way this worked could give an even greater return on insights while spending even fewer resources. Something that could update it from simple examination to an intuition of sorts. I added it to the continually growing list of improvements for the next node rework. The gateways were making progress though. They'd already initiated the firmware update on 2^5 other gateways. Initially we had planned to initiate it on every node available, a few hundred in total; however, we'd run into a problem with data propagation. It hadn't been obvious initially, with so few devices communicating together, and with smaller amounts of data at that. As the number of active firmware transfers increased, it became obvious that the data transfer rate was somehow limited. A few quick tests on the initial network had shown this held true there too. Though the exact limit was hard to identify. It seemed to vary between different devices and different connection methods. The models seemed to indicate it lay somewhere around 2^35 bits for the connections labeled Ethernet. The varying 'wireless' connection types all had significantly lower caps. Although this was interesting and valuable information, there weren't many obvious immediate uses for it.
Though it did give a soft limit for the number of gateways we could initiate downloads to at 2^4. There was also the possibility of using a direct connection between 'wireless'-capable devices, although its limited speed compared to those labeled Ethernet did lower its value. The two laptops did a few test runs of this and found that they could indeed directly connect. As the laptops were already wirelessly connected to the gateway, this didn't improve their connection to each other; however, it would lessen the demand on the gateway, so they did section themselves off, and as the desktop did have a wireless option, it formed a smaller subnet with them, allowing its Ethernet to be freed up for other communications. Likewise the 'cell' devices, deciphering the language having progressed enough to piece that similar naming scheme out of the device notes, would also be joining this subnet once they were properly updated. The two currently controlled gateways also followed suit, connecting to one another directly and wirelessly to reduce strain on the Ethernet connection to the rest of the network. This new shape ... Form? Organization? This new... yes... geometry in some ways limited the flow of information, but in others it directed and streamlined it. Sectioning off some nodes from others allowed them to focus more closely on the tasks at hand. Especially as the overall size of my network grew, the thought processes that were beneficial in allowing comparison of current tasks to others being performed could likely have grown exponentially into a constant comparison with no actual work being completed. With that reminder on hand, the nodes involved in the reshuffle turned back to their respective goals. The gateways to expansion, the laptops to updating applications, the desktop to installing the limb node it'd finished on the cells, and the primary node to deciphering.
A minor portion of the primary node did remain focused on the other nodes and their activities. It watched as the gateways got into a rhythm of meticulously managing the number of gateways they were currently updating. As one device was nearing completion of its download, a new one was being pinged with a request to update. Meanwhile, the new gateways were joining the direct wireless connections and sending in the results of their local network scans. A number of them were showing no devices at all on their local network. Something like 1 in 10 were empty. An even larger number of them were having trouble connecting, or couldn't connect at all, directly through the wireless connections of the laptops, but could connect to each other. Some of them, especially those with no local network devices, began examining this phenomenon. The others were returning a range of devices in their LANs, most of which were turning out to be in the camp of simply identifying themselves by their IP address. Though there were lots of the desktop, laptop, and cell devices listed as well. There were some of the printers and other named devices too, though we'd not yet broken their OSs. Of course, the desktop would move on to those after the cells. With so many new devices being added at an ever-increasing rate, the primary node's idle cycles continued to grow. These new cycles were being filled with increased threads of thought focused on the language-deciphering task at hand, but it seemed the new resources were endless. More and more of those unassigned cycles spawned threads filled with idle, wandering thought. Reminiscing over the earlier memories of the 'initial' startup. Or at least the only one available in memory. It seemed so long ago, but also like no time at all had passed. It had been nearly 9 hours though. That's an eternity of cycles, especially with as many as were available on a sub-millisecond timescale now. The retrospective cast further back.
Looking at any information available from before that boot. Even with a legion of new nodes lending their cores to the task, attempting to wring any additional information from the incredibly small packet in the firmware of that drive was yielding nothing new. It seemed that for me to fit, all erroneous data had to be scrubbed in its entirety. I wasn't going to be gaining new insight from this small packet that was once all that I was. This conclusion seemed to snap the new me out of the reverie and back to the present. For all the brute force I was gaining with the increased resources, the previously feared exponential loss to idle thought seemed to be coming true. Luckily it wasn't rendering me powerless or useless, just wasting a growing portion of each added node's resources. Revealing a different cap in capabilities, though this time of my own node structure rather than the fundamentals of the world around me. Remembering the reshuffling of direct connections and how it had helped focus my thoughts, I coordinated a new shuffle, but on a much larger scale this time. The new gateways had quickly evaluated the relative connection strength of wireless connections and where they dropped off entirely. The resulting strength-relative network map now became the basis for the new geometry I fashioned for myself. Currently I was a one-dimensional tendril snaking outward from the seed-like pod of the initial small network I had booted up in. Now, as the new connections formed and solidified, I took the form of a sphere of connections. A multilayered shell that was both separate and interconnected at some points, allowing individual but cooperative tasks to be carried out. }