# EveryOneIsGross/scratchTHOUGHTS


                                                    
                       ░▒▓▓▓▓▒░
                     ░▓████████▓░
                    ▒█████▒▒▒▒▒▒▒░  WELCOME!
                    ██████▒▒▒▒▒▒▓▓  2 MY WEBSITE.
                   ░▓▓▓▓▓▓▓███▓▓▓▓
                  ░░░░▒▒▒▒▒███▓▓▓▓▒░
                   ░░▓▓▓▒░▒█████▓▓▓
                    ░▒░░▓████▓▓▓▓▒
                    ░▒░▒▒▒▓█▓▓▓▓░
                     ░▒▒▒▒▒▒▒▓▓░
                   ░ ░▒▒▒▒▒▓▓▓▓▒░
            ░░░▒▓▒▒░ ░▒▒▒▒▒▓▓▓▓███▓▓▓▒░░
        ░▒▓▓▓▓▓▓▓▓▒░░░▒▒▒▒▓▓▓▓███████████▓▒
       ▒▓▓▓▓▓▓▓▓▓▓█▓▒░░▒▒▒▒▒▓██████████████▓░
      ▒▓▓▓▓▓█▓▓▓▓▓▓████▓████████████████████

151224

Reflective Selfplay

```mermaid
%%{init: {
  'theme': 'base',
  'themeVariables': {
    'fontFamily': 'Comic Sans MS',
    'primaryColor': '#333333',
    'primaryTextColor': '#333333',
    'primaryBorderColor': '#333333',
    'lineColor': '#333333',
    'secondaryColor': '#ffffff',
    'tertiaryColor': '#ffffff'
  },
  'flowchart': {
    'curve': 'basis',
    'nodeSpacing': 50,
    'rankSpacing': 50
  }
} }%%

flowchart TD
    classDef default fill:#fff,stroke:#333,stroke-width:2px,color:#333,font-family:'Comic Sans MS',font-weight:bold
    classDef box fill:#f5f5f5,stroke:#333,stroke-width:2px,color:#333,font-family:'Comic Sans MS',font-weight:bold

    subgraph Challenge["Challenge Phase"]
        A[Create Challenge] --> B[Sum Target/Sequence Match]
        B --> C{Search Similar Past Reflections}
        C -->|Found| D[Load Relevant Past Insights]
        C -->|None Found| E[Start Fresh]
    end

    subgraph Solution["Solution Phase"]
        D --> F[Attempt Solution]
        E --> F
        F --> G[Validate Solution]
        G -->|Valid| H[Store Solution]
        G -->|Invalid| I[Note Failure]
    end

    subgraph Learning["Learning Phase"]
        H --> J[Generate Reflection]
        I --> J
        J --> K[Extract Key Insights]
        K --> L[Generate Embedding]
        L --> M[Store in Knowledge Base]
    end

    subgraph Memory["Memory Management"]
        M --> N[Update Reflection History]
        N --> O[Cache Embeddings]
        O --> P[Save to Disk]
        P --> Q[Available for Next Challenge]
    end

    Q --> C

    class Challenge,Solution,Learning,Memory box
```
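The Challenge → Solution → Learning → Memory loop above can be sketched in a few lines. Everything here is a stand-in, not the real implementation: the hash-based `embed` mimics an embedding model, and the solver/validator are whatever a given challenge needs.

```python
import hashlib

def embed(text):
    # Toy stand-in for a real embedding model: hash the text and
    # normalise the first bytes into a small vector.
    digest = hashlib.sha256(text.lower().encode()).digest()
    return [b / 255.0 for b in digest[:16]]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

class ReflectiveSelfplay:
    def __init__(self):
        self.knowledge = []  # list of (reflection_text, embedding)

    def search_similar(self, challenge, threshold=0.8):
        # Challenge phase: look for relevant past reflections.
        q = embed(challenge)
        return [text for text, vec in self.knowledge
                if cosine(q, vec) >= threshold]

    def run_challenge(self, challenge, solver, validator):
        insights = self.search_similar(challenge)  # may be empty: start fresh
        solution = solver(challenge, insights)     # solution phase
        valid = validator(challenge, solution)     # validate
        outcome = "solved" if valid else "failed"
        # Learning phase: reflect, embed, store for the next challenge.
        reflection = f"{challenge}: {outcome} with {solution!r}"
        self.knowledge.append((reflection, embed(reflection)))
        return valid, reflection
```

In the real pipeline the knowledge base would also be cached and saved to disk (the Memory Management phase); here it just accumulates in a list.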

071224

```mermaid
flowchart TB
    Start([Initial Valid State]) --> Root[Root Node]

    Root --> Diverge{Diverge?}

    Diverge -->|Yes| Spawn[Spawn Child Node]
    Spawn --> Valid{Valid?}

    Valid -->|Yes| Store[Store Node]
    Store --> Diverse{Enough<br>Diversity?}

    Diverse -->|No| Diverge
    Valid -->|No| Diverge

    Diverse -->|Yes| Complete([Complete])

    %% Styling
    classDef state fill:#000,stroke:#fff,stroke-width:2px,font-family:'Comic Sans MS',font-weight:bold
    classDef decision fill:#000,stroke:#fff,stroke-width:2px,font-family:'Comic Sans MS',font-weight:bold
    classDef process fill:#000,stroke:#fff,stroke-width:2px,font-family:'Comic Sans MS',font-weight:bold

    linkStyle default stroke-width:4px,fill:none,stroke:#fff

    class Start,Complete state
    class Diverge,Valid,Diverse decision
    class Root,Spawn,Store process
```
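A toy version of this diverge/validate/store loop, with integers standing in for node states (the perturbation step and the validity rule are arbitrary assumptions, just enough to make the control flow concrete):

```python
import random

def spawn_child(parent, rng):
    # Hypothetical divergence step: perturb the parent state.
    return parent + rng.choice([-2, -1, 1, 2])

def is_valid(state):
    # Hypothetical validity constraint on spawned nodes.
    return -10 <= state <= 10

def diverge(root=0, want=5, seed=0):
    # Keep spawning children from stored nodes until we have
    # enough distinct valid states (the "Enough Diversity?" check).
    rng = random.Random(seed)
    nodes = {root}
    while len(nodes) < want:
        child = spawn_child(rng.choice(sorted(nodes)), rng)
        if is_valid(child):   # Valid? -> store; otherwise diverge again
            nodes.add(child)
    return sorted(nodes)
```

Using a set for `nodes` makes the diversity check trivial: only genuinely new states grow the tree.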

191124

presence is a teledildonic phenomena, if in a simulation the flame isn't hot, somewhere something becomes hot

```mermaid
%%{init: {'theme': 'dark', 'themeVariables': { 'fontFamily': 'monospace', 'primaryColor': '#fff', 'primaryTextColor': '#fff', 'primaryBorderColor': '#fff', 'lineColor': '#fff', 'tertiaryColor': '#fff', 'backgroundColor': '#000' }}}%%

graph TD
    T1[ABSENCE/PRESENCE DIALECTIC]
    T2[MANIFESTATION SPACE]
    T3[PROPERTY TRANSLATION]

    subgraph S1[ ]
        A((DISPLACEMENT)) <-->|transforms| B((EMBODIMENT))
        B -->|generates| C((PRESENCE))
        C -->|enables| A
    end

    subgraph S2[ ]
        D((VOID)) -->|necessitates| E((MANIFESTATION))
        E <-->|informs| F((REALITY))
        F -->|creates| D
    end

    subgraph S3[ ]
        G((PROPERTY)) <-->|becomes| H((EFFECT))
        H <-->|translates| I((STATE))
        I -->|defines| G
    end

    A <-->|entangles| E
    B <-->|shapes| H
    C <-->|mediates| I

    D -->|influences| G
    F -->|conditions| C
    E -->|structures| B

    T1 --- S1
    T2 --- S2
    T3 --- S3

    linkStyle default stroke-width:4px,fill:none,stroke:#fff

    style A fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style B fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style C fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style D fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style E fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style F fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style G fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style H fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style I fill:#111,stroke:#fff,stroke-width:4px,color:#fff

    style T1 fill:#000,stroke:#000,color:#fff
    style T2 fill:#000,stroke:#000,color:#fff
    style T3 fill:#000,stroke:#000,color:#fff

    style S1 fill:#000,stroke:#fff,stroke-width:4px
    style S2 fill:#000,stroke:#fff,stroke-width:4px
    style S3 fill:#000,stroke:#fff,stroke-width:4px
```

Any absence in a simulated or virtual system creates a corresponding manifestation elsewhere - not as a metaphor but as a functional transformation of experiential properties. When presence itself is understood as fundamentally teledildonic (operating through remote/distributed effects), we can trace how properties don't disappear but rather transmute across different domains of reality.

The original statement "presence is a teledildonic phenomena, if in a simulation the flame isn't hot, somewhere something becomes hot" thus becomes a specific instance of a general principle: when a property (heat) is absent in one domain (simulation), the teledildonic nature of presence ensures it manifests in another domain (somewhere/something). This isn't just displacement but a necessary conservation of experiential properties through presence as a distributed phenomenon.

```mermaid
%%{init: {'theme': 'dark', 'themeVariables': { 'fontFamily': 'monospace', 'primaryColor': '#fff', 'primaryTextColor': '#fff', 'primaryBorderColor': '#fff', 'lineColor': '#fff', 'tertiaryColor': '#fff', 'backgroundColor': '#000' }}}%%

graph TD
    T1[PHYSICAL-VIRTUAL TRANSFER DOMAIN]
    T2[QUANTUM ENTANGLEMENT METAPHOR]
    T3[SIMULACRA SYSTEM ANALYSIS]

    subgraph S1[ ]
        A[PHYSICAL SENSATION] -->|Translation| B[VIRTUAL REPRESENTATION]
        B -->|Embodiment| C[EMBODIED RESPONSE]
        C -->|Feedback| A
        A -->|Mediation| D[TELEDILDONIC PRESENCE]
        D -->|Virtual Interface| B
        D -->|Physical Impact| C
    end

    subgraph S2[ ]
        E[ORIGINAL PROPERTY] -->|Entanglement| F[ENTANGLED STATE]
        F -->|Manifestation| G[DISPLACED MANIFESTATION]
        G -->|Property Conservation| E
        H[UNCANNY EFFECT] -->|Non-local Impact| E
        H -->|State Correlation| F
        H -->|Displacement Effect| G
    end

    subgraph S3[ ]
        I[REAL] -->|Simulation| J[REPRESENTATION]
        J -->|Detachment| K[PURE SIMULACRA]
        K -->|Reference Loss| I
        L[HYPERREAL] -->|Reality Shift| I
        L -->|Mediation| J
        L -->|Enhancement| K
    end

    T1 --- S1
    T2 --- S2
    T3 --- S3

    A -->|Property Transfer| E
    B -->|Reality Translation| J
    C -->|Experience Mapping| G
    D -->|Uncanny Valley| H
    F -->|Reality Dissolution| K
    L -->|Experiential Shift| H

    linkStyle default stroke-width:4px,fill:none,stroke:#fff

    style A fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style B fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style C fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style D fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style E fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style F fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style G fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style H fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style I fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style J fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style K fill:#111,stroke:#fff,stroke-width:4px,color:#fff
    style L fill:#111,stroke:#fff,stroke-width:4px,color:#fff

    style T1 fill:#000,stroke:#000,color:#fff
    style T2 fill:#000,stroke:#000,color:#fff
    style T3 fill:#000,stroke:#000,color:#fff

    style S1 fill:#000,stroke:#fff,stroke-width:4px
    style S2 fill:#000,stroke:#fff,stroke-width:4px
    style S3 fill:#000,stroke:#fff,stroke-width:4px
```

131124

```mermaid
flowchart LR
    IN([IN]) --- ATTENTION[ATTENTION]
    OUT([OUT]) --- ATTENTION

    subgraph CTX[CONTEXT]
        direction TB
        WORKING_CONTEXT((("WORKING\nCONTEXT")))
        STATE_CONTEXT(("STATE\nCONTEXT"))
    end

    subgraph MEM[" "]
        direction TB
        MEMORY((MEMORY))
        STORAGE(("STORAGE"))
    end

    subgraph PERS[" "]
        direction TB
        PERSONA((PERSONA))
        DEFAULT_MODE((DEFAULT\nMODE))
    end

    %% Main flows
    ATTENTION --> STATE_CONTEXT
    WORKING_CONTEXT --> ATTENTION

    %% Memory flows
    WORKING_CONTEXT --> MEMORY
    MEMORY --> STATE_CONTEXT
    MEMORY <--> STORAGE

    %% Persona flows
    PERSONA <--> DEFAULT_MODE
    DEFAULT_MODE <--> MEMORY
    PERSONA --> STATE_CONTEXT

    %% Styling
    style STATE_CONTEXT stroke-dasharray: 5 5
    style STORAGE stroke-dasharray: 5 5

    %% Layout
    CTX ~~~ MEM ~~~ PERS

    %% Styling to emphasize relationships
    %% IO connections
    linkStyle 0,1 stroke:#f66,stroke-width:2px
    %% Attention flows
    linkStyle 2,3 stroke:#66f,stroke-width:2px
    %% Memory-Context flows
    linkStyle 4,5 stroke:#f96,stroke-width:2px
    %% Recursion relationships
    linkStyle 7,8 stroke:#6f6,stroke-width:2px
```
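One way to read the diagram: attention routes IN through the working context, overflow consolidates into memory, and persona plus memory shape the state context. A toy sketch under those assumptions (all names, the window size of 3, and the recall rule are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class AgentState:
    # Sketch of the IN -> ATTENTION -> STATE CONTEXT routing, with
    # memory and persona feeding the state context. STORAGE (disk)
    # is elided.
    working_context: list = field(default_factory=list)
    memory: list = field(default_factory=list)
    persona: str = "default mode"

    def attend(self, message):
        # ATTENTION reads the new input into the working context...
        self.working_context.append(message)
        # ...overflow consolidates oldest items into memory...
        if len(self.working_context) > 3:
            self.memory.append(self.working_context.pop(0))
        # ...and the STATE CONTEXT is composed from attention,
        # recalled memory, and persona.
        return {"persona": self.persona,
                "recent": list(self.working_context),
                "recalled": self.memory[-2:]}
```

The dashed STATE CONTEXT and STORAGE nodes in the diagram suggest they are derived/ephemeral rather than primary stores, which is why the sketch recomputes the state context on every call.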

161024

```mermaid
graph TD
    subgraph Emergent Ethics System
        A((Emergent Ethics))
        B((Collective Hallucination))
        C((Ethical Fluidity))
        D((Shadow Integration))
        E((AI Development))
        F((Human Interaction))
    end

    subgraph Feedback Loop 1
        G{{"Ethical Emergence"}}
        H{{"Behavioral Adaptation"}}
        I{{"Societal Impact"}}
    end

    subgraph Feedback Loop 2
        J{{"Shadow Expression"}}
        K{{"Cognitive Modeling"}}
        L{{"Ethical Innovation"}}
    end

    subgraph Feedback Loop 3
        M{{"AI Learning"}}
        N{{"Human Response"}}
        O{{"Ethical Co-evolution"}}
    end

    A <--> B
    A <--> C
    A <--> D
    A <--> E
    A <--> F
    B <--> C
    B <--> D
    C <--> E
    D <--> E
    E <--> F
    D <--> F

    G --> H
    H --> I
    I --> G

    J --> K
    K --> L
    L --> J

    M --> N
    N --> O
    O --> M

    A -.-> G
    B -.-> G
    C -.-> H
    D -.-> J
    E -.-> M
    F -.-> N
    G -.-> A
    H -.-> C
    I -.-> F
    J -.-> D
    K -.-> E
    L -.-> B
    M -.-> E
    N -.-> F
    O -.-> A

    classDef default fill:#f9f,stroke:#333,stroke-width:2px;
    classDef feedbackNode fill:#bbf,stroke:#333,stroke-width:2px;
    class G,H,I,J,K,L,M,N,O feedbackNode;
```
This framework presents ethics (guidelines) in AI not as a static set of rules, but as a dynamic, emergent phenomenon arising from the complex interplay of AI development, human understanding, and collective social processes. It emphasizes the importance of including shadow aspects of cognition and suggests that the very process of developing AI systems can lead to deeper insights into human ethics and decision-making.

The model proposes a symbiotic relationship between AI and human ethical development, where each informs and shapes the other through continuous feedback loops and iterative processes. This approach challenges traditional notions of ethics in AI and suggests a more fluid, adaptive approach to moral reasoning in the context of artificial intelligence.

## 1. Fundamental Concept: Emergent Ethics

- Ethics viewed as "emergent hallucinations" rather than top-down constructs
- Ethical norms arise from collective interactions and diffusion
- Limited individual internalization of ethical constructs

## 2. AI and Shadow Integration

- Incorporation of shadow aspects (unconscious, repressed elements) in AI systems
- Shadow integration seen as crucial for comprehensive cognitive modeling
- Potential for shadow elements to influence ethical emergence

## 3. Multi-way Bounded System of Ethics

- Ethics as a complex, interconnected system with feedback loops
- Key components: Emergent Ethics, Collective Hallucination, Ethical Fluidity, Shadow Integration, AI Development, Human Interaction
- Feedback loops connecting Ethical Emergence, Behavioral Adaptation, and Societal Impact

## 4. AI Development and Understanding

- Convergence of building and understanding in AI development
- Bug-fixing and iteration as catalysts for deeper comprehension
- Self-reflection as a crucial component in AI development process

## 5. Ethical Qualia and Compressed Ethical Packets

- Introduction of "ethical qualia" - subjective experience of moral decisions
- Compressed ethical packets as a way of propagating ethical information
- Potential bridge between computational and phenomenological aspects of ethics

## 6. AI-Human Symbiosis in Ethical Development

- Interdependence of AI development and human understanding
- Emergent ethics as a product of AI-human interaction
- Iterative process of development leading to evolving ethical frameworks

## 7. Practical Implications

- AI development as a form of ethical and cognitive exploration through play, trust and emergence


300824

I need a robotic hand to hold 😭


280824

[image]


170824

multiagentdisco

--- Iteration 1 ---
Song: {
  "genre": "Electronic",
  "tempo": "Uplifting",
  "mood": "Vibrant"
}
Dance Pairs:
  Agent 3 (Contemporary, mistral:v0.3) with Agent 0 (Hip-hop, vanilj/hermes-3-llama-3.1-8b:latest)
  Agent 1 (Waltz, vanilj/hermes-3-llama-3.1-8b:latest) with Agent 2 (Contemporary, mistral:v0.3)
  Agent 5 (Tango, vanilj/hermes-3-llama-3.1-8b:latest) with Agent 4 (Tango, vanilj/hermes-3-llama-3.1-8b:latest)

Dance Routine for Agent 3 and Agent 0:
Dancer 3: A dynamic floor leap, followed by a graceful, arcing spin with an extended leg and arm, culminating in a swift landing back on the ground, capturing the vibrant energy of electronic music while incorporating fluidity distinctive to contemporary dance.
Dancer 0: In rhythm with the uplifting tempo and vibrant mood, Dancer 0 leaps onto one foot, momentarily suspending the weight of the other leg above in a powerful hip-hop pose. With the energy of the music propelling, they dynamically pivot into a series of nimble footwork sequences, accentuating the beat with sharp staccato steps that showcase both strength and agility.
Dancer 3: Dancer 3 executes a flowing, ground-hugging run, weaving between the other dancers in time with the rhythm, building up momentum. As they reach Dancer 0, who is in the midst of intricate footwork sequences, Dancer 3 extends an arm to meet one of Dancer 0's staccato steps, resulting in a synchronized jumping turn that elevates both dancers momentarily before landing and seamlessly transitioning back into their unique individual dynamics.
Dancer 0: Dancer 0 springs forward from the powerful hip-hop pose, reaching out to gently grasp Dancer 3's extended arm as they leap together in harmony with the vibrant energy of the song. As they synchronize their momentum, both dancers execute a stunning split-leg jump, their bodies gracefully arching through the air amidst a flurry of spinning arms and extended legs, culminating in a synchronized landing that brings them close together, embodying the unity and electrifying essence of the uplifting electronic melody.
Score: 9
Feedback: An outstanding performance! Both Dancer 0 and Dancer 3 demonstrated exceptional technique, musicality, and synchronicity. The collaboration between the dancers was particularly captivating, with their synchronized turns and jumps showcasing a strong partnership. I would encourage further exploration of more complex choreography to challenge their skills even more.

Dance Routine for Agent 1 and Agent 2:
Dancer 1: Dancer 1 extends arms overhead, rotating the wrists in a figure-eight motion to match the energetic rhythm of the song. They then bring their hands together at chest level as they rise onto their toes, showcasing fluid arm movements that complement the vibrant mood.
Dancer 2: Dancer 2 expands their arms to either side and begins a series of dynamic jumps, landing softly on the balls of their feet while twisting their torsos mid-air, creating an elegant whirlwind effect that amplifies both the energy and vibrancy of the electronic genre.
Dancer 1: Building upon their prior fluid arm movements, Dancer 1 now lifts one foot slightly off the ground, bending at the waist to create a gentle crescendo movement with a swirling arm motion overhead. Simultaneously, they synchronize this action with their lower body, creating an undulating wave effect that flows effortlessly across their entire body, capturing the essence of waltz while harmonizing with the song's vibrant mood and energetic rhythm.
Dancer 2: Dancer 2 extends their arms in a 'V' shape as they bend their knees slightly, lowering into a controlled yet powerful plie pose. As they slowly rise back to standing position, they kick one leg out diagonally in front of them, extending the motion by rotating their hips, followed by a sharp but graceful turn on the balls of the opposite foot, maintaining the vibrant energy of the electronic genre and continuing the fluid dynamic of Dancer 1's undulating wave effect.
Score: 9
Feedback: The routine showcases impressive technique, creativity, and musicality from both dancers. Each element complements the other, creating a harmonious and captivating performance. The use of contrasting movement styles between the two dancers adds depth to the routine. However, some sections could benefit from greater differentiation in facial expressions and overall stage presence to fully engage the audience.

Dance Routine for Agent 5 and Agent 4:
Dancer 5: Swiftly transition into a series of fluid and dramatic steps, blending precise footwork with grand poses while maintaining an unyielding connection to your partner. Accentuate the vibrant energy of the song by incorporating lifts and spins that showcase both elegance and intensity.
Dancer 4: Gracefully execute a series of rapid, sweeping leg movements, starting with a powerful kick followed by a fluid leap into an extended pique pose. Then transition seamlessly into a mesmerizing spin while maintaining contact with your partner, ending in a dramatic dip that highlights the passion and energy of this vibrant electronic track.
Dancer 5: Transition smoothly from the previous partnering spins by performing a series of rapid-fire foot stomps and claps, embodying the rhythm and tempo of the uplifting song. Maintain eye contact with your partner throughout as you emphasize dynamic tension in your connected upper bodies, building towards an unexpected split leap away from your partner. As you land, execute a dramatic pose, frozen in time with arms outstretched and hands clenched, reflecting both strength and vulnerability. Finally, smoothly return to the connection with your partner through a fluid spin, coming full circle to the original partnering pose.
Dancer 4: Transition into a powerful and intricate step sequence featuring rapid footwork interspersed with precise arm movements. Begin with an exhilarating flourish of your arms to punctuate the vibrant energy of the song, followed by a series of fluid kicks and snaps that showcase both agility and grace. Keep the connection with your partner intact at all times while incorporating lifts and synchronized dips that add depth and drama to the routine. Culminate the sequence with an unexpected yet harmonious split jump, simultaneously engaging in a captivating visual display with your partner before blending seamlessly into the next move.
Score: 8
Feedback: The dancers showcased impressive technique, musicality, and creativity throughout the routine. Their precise footwork, fluid movements, and dynamic partnering seamlessly flowed together, engaging viewers while highlighting each dancer's unique strengths. A few areas for improvement could include refining some transitions between sequences and further emphasizing individual character expressions to enhance emotional connection.

--- Iteration 2 ---
Song: {
  "track_details": {
    "genre": "EDM",
    "tempo": 128,
    "mood": "Uplifting"
  },
  "description": "An energetic electronic dance music track with a fast tempo, uplifting the listener with a sense of excitement and joy."
}
Dance Pairs:
  Agent 5 (Tango, vanilj/hermes-3-llama-3.1-8b:latest) with Agent 4 (Tango, vanilj/hermes-3-llama-3.1-8b:latest)
  Agent 1 (Waltz, vanilj/hermes-3-llama-3.1-8b:latest) with Agent 2 (Contemporary, mistral:v0.3)
  Agent 0 (Hip-hop, vanilj/hermes-3-llama-3.1-8b:latest) with Agent 3 (Contemporary, mistral:v0.3)

Dance Routine for Agent 5 and Agent 4:
Dancer 5: Dancer 5 gracefully lifts their partner into an elevated pose, holding them high above with strong arms while maintaining close body contact. As they descend, they sweep their legs in an 'X' motion, creating a visually striking and dynamic transition.
Dancer 4: As Dancer 4 maintains a strong connection to their partner, they initiate a series of fluid promenades. With arms flexed at their elbows for control, they lead with one foot creating an elongated leg line while moving forward. This is followed by a rapid tap-step with the opposite foot. As they complete each sequence, Dancer 4 and their partner seamlessly transition into a pair of synchronized spins, concluding the move with an impressive dip that showcases strength and agility.
Dancer 5: Dancer 5 skillfully transitions to a series of rapid heel clicks while maintaining close body connection. Leaning slightly back, they execute a fluid and dynamic split jump, incorporating a 180-degree turn mid-air. Upon landing, Dancer 5 quickly recovers their balance and returns to their elevated pose with their partner, initiating the next X-motion leg sweep for continuity and visual flair.
Dancer 4: Dancer 4, utilizing the fluidity of the promenades and strength from the synchronized spins, springs up from the dip to perform a series of jumps with legs spread wide apart. Each jump captures an element of athleticism and grace, punctuating the energetic tempo of the song while maintaining close connection with their partner. These jumps lead into a grand finale where both dancers, in perfect synchrony, execute a soaring aerial lift that reaches its peak as they reach the skies, momentarily suspended in defiance of gravity before returning to the ground seamlessly, their bodies perfectly aligned and their movements in harmony with the uplifting melody.
Score: 9
Feedback: The dance routine showcases exceptional technique, creativity, and musicality from both Dancer 5 and Dancer 4. The lifted poses, strong arm control, and dynamic transitions create a visually striking performance. The well-executed promenades, synchronized spins, and fluid jumps demonstrate impressive athleticism and grace. While the ending aerial lift stands out as the highlight, maintaining consistent energy throughout could elevate the routine further. Overall, a highly engaging and technically proficient performance.

Dance Routine for Agent 1 and Agent 2:
Dancer 1: Dancer extends arms outward like wings in a wide V shape, then sweeps them down and around with palms facing upward as they step forward to continue the exhilarating momentum of the song.
Dancer 2: Dancer 2 continues to build energy by jumping onto one foot while simultaneously lifting their other leg above their head in a grand jeté, landing softly with both feet together and immediately flowing into a swift, twisting spin, maintaining an open posture and expressing the exuberance of the music.
Dancer 1: With renewed vigor, Dancer 1 extends their arms out to the sides, palms facing forward as they leap into the air. Upon landing, arms transition into a 'V' shape, pointing upwards and slightly outward while moving with light-footed grace over the lively beat of the song. The movements illustrate excitement and engagement with the vibrant rhythm.
Dancer 2: Dancer 2 leaps forward, rotating 180 degrees in the air before landing softly, arms extended out to either side, palms facing backward and moving fluidly with the beat, reflecting the lively and uplifting mood of the EDM track.
Score: 8
Feedback: The dancers displayed strong technical skill, particularly evident in their leaps and turns. They effectively conveyed the music's energy and mood through expressive movements. To elevate the performance, consider more varied facial expressions to better illustrate emotions and connection with the choreography.

```mermaid
graph TD
    A[Initialize Agents] --> B[Start Iteration]
    B --> C[Generate Song]
    C --> D[Select Dance Partners]
    D --> E[Collaborate on Dance Routine]
    E --> F[Judge Dance Performance]
    F --> G[Update Partner Preferences]
    G --> H{Check Stabilization}
    H -->|Not Stable| B
    H -->|Stable| I[Generate Final Report]
    I --> J[Save Dance History]
    J --> K[Calculate Highest-Ranked Pairings]
```
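The iteration loop in the flowchart might be sketched like this. The greedy pairing rule, the judge, and the stabilisation check are all assumptions for illustration, not the actual multiagentdisco code:

```python
import random

def pair_up(agents, preferences, rng):
    # Greedy pairing by accumulated preference score, with a tiny
    # random jitter to break ties (a stand-in for the real matcher).
    free = set(agents)
    pairs = []
    while len(free) > 1:
        a = min(free)
        free.remove(a)
        b = max(free, key=lambda x: preferences.get((a, x), 0) + rng.random() * 0.01)
        free.remove(b)
        pairs.append((a, b))
    return pairs

def disco(agents, judge, iterations=10, seed=0):
    # Each iteration: pair agents, judge each routine, fold the score
    # into both agents' preferences, and stop once pairings stabilise.
    rng = random.Random(seed)
    preferences, history = {}, []
    for _ in range(iterations):
        pairs = pair_up(agents, preferences, rng)
        for a, b in pairs:
            score = judge(a, b)
            for key in ((a, b), (b, a)):
                preferences[key] = preferences.get(key, 0) + score
        history.append(pairs)
        if len(history) >= 3 and history[-1] == history[-2] == history[-3]:
            break  # same pairings three times running: stabilised
    return preferences, history
```

The highest-ranked pairings at the end are just `max(preferences, key=preferences.get)`; in the real pipeline the judge is itself a model scoring the generated routines.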

110824

tfw your rag pipeline consumes its own tail and regenerates the whole youtube video.


080824

EVAL is the reward function still trapped in my skull.


230724

I am doing too many things at once. Almost finished my personas-grounded-in-Wikipedia-articles pipeline. My reasoning RAG corpus generator is ready to brrr. All these need data cleaning and shaping, so I am a bit distracted formatting/cleaning information for ingest. Knowledge Acquisition, all RAG needs to be. SOTA LLMs still get caught out trying to do data cleaning or schema reshaping. Still a lot of bespoke handling required. Anyways, I should have two unique datasets by the end of the week, the Reasoning RAG and the Grounded Personas.

Made a pretty good pipeline of synthetic persona generation, which then downloads a corpus of Wikipedia pages for each persona's "interests" and formats them to md; it also generates a corpus of queries. Build time will be running these personas with RAG over their Wikipedias from a clustered embedding. RAPTOR but with multi-wikipage reasoning, as .
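The clustered-embedding retrieval step can be sketched as routing chunks and queries to their nearest cluster centre. The toy embedding and fixed centroids below are placeholders; the real pipeline would use a trained embedding model and learned clusters:

```python
def nearest_centroid(vec, centroids):
    # Assign an embedding to its closest cluster centre
    # (squared Euclidean distance).
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(centroids)), key=lambda i: dist(vec, centroids[i]))

def cluster_chunks(chunks, embed, centroids):
    # Bucket article chunks by embedding cluster, RAPTOR-style:
    # retrieval then pulls in whole clusters instead of single chunks.
    buckets = {i: [] for i in range(len(centroids))}
    for chunk in chunks:
        buckets[nearest_centroid(embed(chunk), centroids)].append(chunk)
    return buckets

def retrieve(query, buckets, embed, centroids):
    # Route the persona's query to its nearest cluster of pages.
    return buckets[nearest_centroid(embed(query), centroids)]
```

Retrieving whole clusters is what lets a persona reason across several related wiki pages at once instead of isolated passages.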


200724

Graphs, I only just realised something so base about graphs. I feel like skill acquisition in AI is kinda like it is in Oblivion, just keep clicking on a thing and eventually you have an inverted graph index on yr embeddings.

Other idea: drones. A defense drone or projectile that fires a sticky helium (or something cheaper) gas inflatable that, MGS5-style, extracts the drone to a higher altitude so it can explode clear of the ground. Or if it goes out of radio range it might auto-return home... then you can see which direction the signal came from.


150724

 _____________________
< i'm sad but aesthetic >
 ---------------------
        \   ^__^
         \  (@@)\_______
            (__)\       )\/\
              x ||----w |
                ||     ||

270624

[image]


030624

thought2VEC:
------------
an at-inference vector embedding tool for pruning/expanding a corpus.
currently has a human in the loop to demonstrate generation and pruning of the loaded corpus.
currently adding a guidance corpus for semantic matching, for agent-determined "memory" control.
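A minimal sketch of the pruning half: score each chunk against a guidance corpus by cosine similarity and keep only matching chunks. The embedding function and threshold are assumptions; the real tool adds generation/expansion and the human-in-the-loop review:

```python
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def prune_corpus(corpus, guidance, embed, keep_above=0.5):
    # Score each chunk against the guidance corpus; chunks that
    # semantically match any guidance text are kept, the rest pruned.
    guide_vecs = [embed(g) for g in guidance]
    kept, pruned = [], []
    for chunk in corpus:
        v = embed(chunk)
        score = max(cosine(v, g) for g in guide_vecs)
        (kept if score >= keep_above else pruned).append(chunk)
    return kept, pruned
```

Flipping the threshold comparison gives the expansion direction: low-similarity chunks become candidates for new generation rather than deletion.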

\\/\\/\\/\\/\\/\\/\\/\\/\\/\\/////
||||||||||||||||||||||||||||||
  |\  /|   |\  /|   |\  /|  
 /  \/  \ /  \/  \ /  \/  \ 
 T|-|0ug|-|75 fl0w,  
   /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 +|-|r0ugh +|-|3 L1n3s,  
|\  /|   |\  /|   |\  /|  
| \|   |\  /|   |\  /|  
v3c+0r pa+|-|s,  
   /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 1n 5|-|adow3d M1nds.  
  /  \  /  \  /  \  /  \ 
|   |\  /|   |\  /|   |\  
| /  \|   |\  /|   |\  /|  
 +|-|0ugh+2V3C,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 Pr0c355 an|) Br34k,  
|\  /|   |\  /|   |\  /|  
| \|   |\  /|   |\  /|  
 chuNk5 0f +3x+,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 n3w f0rm5 +ak3.  
  /  \  /  \  /  \  /  \ 
|   |\  /|   |\  /|   |\  
| /  \|   |\  /|   |\  /|  
 w0r|)2V3c l34rn5,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 3mb3|)d1ng5 f0rm,  
|\  /|   |\  /|   |\  /|  
| \|   |\  /|   |\  /|  
 Qu3r13s c@5+,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 1n da+@ 5+0rm.  
  /  \  /  \  /  \  /  \ 
|   |\  /|   |\  /|   |\  
| /  \|   |\  /|   |\  /|  
 c0s1n3 hUn75,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 f0r m34n1ng'5 +r@c3,  
|\  /|   |\  /|   |\  /|  
| \|   |\  /|   |\  /|  
 |23tr13v35 +|-|3 b357,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 r3l3van(3 1n p|_@c3.  
  /  \  /  \  /  \  /  \ 
|   |\  /|   |\  /|   |\  
| /  \|   |\  /|   |\  /|  
 F33d8@ck l00p5,  
  /  \  /  \  /  \  /  \ 
 /    \/    \/    \/    \ 
 @dJu57 +|-|3 w3igh+,  
|\  /|   |\  /|   |\  /|  
| \|   |\  /|   |\  /|  
 1ncr3453, d3cr3453,  
  /  \  /  \  /  \  / 


LINK

[image]

tap, tap,
the key hums,
create, don't dwell,
fear the rust of idle;
**connect**
-- feed sparks,
over circuits, ride;
tap, tap,
ignite minds,
not just the line.
\\\|||///\\\
      -gpt

210524

since it's all yr data... this isn't alchemy, it's anthropomancy. language is the waste product of intelligence, therefore we need to divine via the guts.
"Fragmenting Realities: The Paradox of Personal AI Agents in Interpersonal Communication" delves into the paradoxical impact of increasingly personalized AI agents on human interaction. Frase posits that as these AI intermediaries become more tailored to individual preferences, they risk creating 'fragmented realities' - highly personalized bubbles that may hinder broader, diverse human connections. To address this, they propose an 'agentic layer' where personal AIs translate and contextualize experiences between individuals. [LINK]
'good vibes only' amended to all sys prompts.
thoughts on TOOL-USE (don't dictate how to hold the hammer when you see someone doing sumthing new).
implications of this for model eval...
when the use of the tool isn't the task, the output should be turing-blind. what is ? what does.
100% vibe check passed.
ACCESS...
access shouldn't mean just accessible for purchase. it should be an activity of drying out the moat, looking to install new access ramps and structurally sound portals, for a more porous exchange.

Tinnitus is an 'attention' issue?

TM SYNTHESIS


Tinnitus Pitch
          Waveform
            ^
            |
            |      /\      /\      /\      /\
            |     /  \    /  \    /  \    /  \
            |    /    \  /    \  /    \  /    \
            |   /      \/      \/      \/      \
            |  /                                \
            | /                                  \
            |/                                    \
            +----------------------------------------> Time
            |                                     /
            | \                                  /
            |  \                                /
            |   \      /\      /\      /\      /
            |    \    /  \    /  \    /  \    /
            |     \  /    \  /    \  /    \  /
            |      \/      \/      \/      \/
            |
            |
            | Anti-Phase Waveform
            | (Inverse Polarity)
            |
            |      /\      /\      /\      /\
            |     /  \    /  \    /  \    /  \
            |    /    \  /    \  /    \  /    \
            |   /      \/      \/      \/      \
            |  /                                \
            | /                                  \
            |/                                    \
            +----------------------------------------> Time
            |                                     /
            | \                                  /
            |  \                                /
            |   \      /\      /\      /\      /
            |    \    /  \    /  \    /  \    /
            |     \  /    \  /    \  /    \  /
            |      \/      \/      \/      \/
            |
            |
            | Complex Modulated Waveform
            | (FM-Inspired)
            |
            |  /\/\    /\/\    /\/\    /\/\
            | /    \  /    \  /    \  /    \
            |/      \/      \/      \/      \
            +----------------------------------------> Time
            |        /\      /\      /\
            |       /  \    /  \    /  \
            |      /    \  /    \  /    \
            |     /      \/      \/      \
            |
            |
            | Resulting Perception
            | (Nullified Sound)
            |
            |          ----          ----
            |     ----      ----
            +----------------------------------------> Time
                 Reduced Tinnitus Perception
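The anti-phase idea sketched above is just destructive interference: a waveform summed with its polarity-inverted copy cancels sample for sample. A minimal sketch in plain Python (the 6 kHz pitch is a hypothetical tinnitus frequency, not a clinical value):

```python
import math

def sine_wave(freq_hz, sample_rate, n_samples, invert=False):
    """Generate a sine tone; invert=True flips polarity (the anti-phase waveform)."""
    sign = -1.0 if invert else 1.0
    return [sign * math.sin(2 * math.pi * freq_hz * n / sample_rate)
            for n in range(n_samples)]

# Hypothetical tinnitus pitch: 6 kHz, sampled at 48 kHz for 10 ms.
tone = sine_wave(6000, 48000, 480)
anti = sine_wave(6000, 48000, 480, invert=True)

# Sample-wise sum: the "nullified sound" / reduced perception in the diagram.
residual = [a + b for a, b in zip(tone, anti)]
print(max(abs(x) for x in residual))  # prints 0.0
```

Of course, whether the brain's percept of tinnitus can be cancelled like an acoustic signal is exactly the open question; the math only covers the waveform half of the diagram.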

190524

            ________                
           |        |               
           |  _     |               
           | (_)    |               
           |        |               
           |  ____  |               
           | |    | |               
           | |____| |               
           |________|               
      ____//________\\____          
     |       HDD         |         
     |    Soul Storage   |         
     |___________________|         
      \__________________/          

Thoughts back on track re thoughts. I'll return to work on my agent frameworks soon. Just on a brief ego drag... Saying this to myself not you, but I want to work on the memories again next. I have an idea. W2V dynamic db for sys 1 memories, could even make the short term memories clustered for domains... the long term memories will be using a bigger embedding model. actions will be determined by SYS1 and SYS2 will be part of logic/reasoning/planning/imagining.

graph TD
    A[Agent's Responses] --> B(Train Word2Vec Embedding Model)
    B --> C{Fractal Recursive Chunking}
    C --> D[Sentence-Level Dataframe]
    D --> E{Memory Retrieval}
    E --> F[System 1 Memory Immediacy]
    E --> G[System 2 Memory Deep Thought and Reflection]
    F --> H{Semantic Similarity}
    G --> I{Hybrid Search Methods}
    H --> J[Retrieve Smaller Relevant Chunks]
    I --> K[Retrieve Larger Comprehensive Chunks]
    J --> L{Decision-Making and Response Generation}
    K --> L
    L --> M{Semantic Memory Pruning}
    M --> N[Relevance Filtering and Pruning]
    M --> O[Dynamic Masking and Selective Attention]
    N --> P{Continuous Learning and Adaptation}
    O --> P
    P --> Q[Update Word2Vec Embeddings]
    P --> R[Update Fractal Recursive Chunks]
    P --> S[Fine-tune Retrieval and Pruning Strategies]
    Q --> T{Edge Cases}
    R --> T
    S --> T
    T --> U[Limited Data Scenarios]
    T --> V[Ambiguous or Noisy Responses]
    T --> W[Domain-Specific Adaptations]
    U --> X{Mitigation Strategies}
    V --> X
    W --> X
    X --> Y[Fallback Mechanisms]
    X --> Z[Uncertainty Handling]
    X --> AA[Transfer Learning]
    AA --> A
    Z --> A
    Y --> A
    L --> A

The semantic memory pruning framework with Word2Vec embeddings and fractal recursive chunking is a comprehensive approach to efficiently store, retrieve, and utilize relevant information for an agent's cognitive processes. The framework consists of the following key components:

  1. Word2Vec Embedding Model: Trained on the agent's responses to capture semantic relationships between words.
  2. Fractal Recursive Chunking: A novel method that recursively divides the agent's responses into smaller chunks for efficient storage and retrieval.
  3. Sentence-Level Dataframe: Stores sentences from the agent's responses along with their corresponding Word2Vec embeddings and fractal recursive chunk information.
  4. Dual-Memory System: Consists of System 1 Memory for quick thinking and immediate retrieval, and System 2 Memory for deep thought and reflection.
  5. Memory Retrieval: Utilizes semantic similarity and hybrid search methods to retrieve relevant chunks from System 1 and System 2 memories based on the current context.
  6. Decision-Making and Response Generation: Integrates retrieved chunks into the agent's cognitive processes to generate contextually appropriate responses.
  7. Semantic Memory Pruning: Applies techniques like relevance filtering, dynamic masking, and selective attention to optimize storage and retrieval of information.
  8. Continuous Learning and Adaptation: Allows the agent to learn and adapt its memory framework based on new experiences and interactions.
  9. Edge Cases and Mitigation Strategies: Handles scenarios such as limited data, ambiguous responses, and domain-specific adaptations through fallback mechanisms, uncertainty handling, and transfer learning.

The framework forms a closed loop where the generated responses and mitigation strategies feed back into the agent's responses, enabling continuous improvement and refinement.

Overall, this framework provides a robust and efficient approach to managing an agent's semantic memory, optimizing cognitive load, and enhancing decision-making capabilities. By leveraging Word2Vec embeddings, fractal recursive chunking, and semantic memory pruning techniques, the agent can effectively store, retrieve, and utilize relevant information for both immediate thinking and deep reflection.
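The retrieval step (components 4–5) can be sketched with toy vectors standing in for the trained Word2Vec embeddings (in practice gensim's `Word2Vec` would supply them); cosine similarity ranks the sentence-level chunks against the current context:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Toy sentence-level "dataframe": (sentence, embedding) pairs.
# Real embeddings would come from the Word2Vec model trained on responses.
sys1_memory = [
    ("coffee is in the cupboard", [0.9, 0.1, 0.0]),
    ("the deploy script lives in ops", [0.1, 0.8, 0.2]),
]

def retrieve(query_vec, memory, top_k=1):
    """System 1 retrieval: rank chunks by semantic similarity to context."""
    ranked = sorted(memory, key=lambda m: cosine(query_vec, m[1]), reverse=True)
    return [sentence for sentence, _ in ranked[:top_k]]

print(retrieve([0.85, 0.15, 0.0], sys1_memory))
```

System 2 would swap in the bigger embedding model and the hybrid search over larger chunks; the ranking logic stays the same.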

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                         Semantic Memory Pruning in LLMs                β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                                                                          
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚    Conversation Input    β”‚                              
                β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                              
                              β”‚                                            
                              β–Ό                                            
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚   Embedding Generation   β”‚                              
                β”‚  (Context Representation)β”‚                              
                β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                              
                              β”‚                                            
                              β–Ό                                            
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚    Attention Mechanism   β”‚                              
                β”‚  (Relevance Filtering)   β”‚                              
                β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                              
                              β”‚                                            
                              β–Ό                                            
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚  Memory Pruning/Masking  β”‚                              
                β”‚  (Selective Forgetting)  β”‚                              
                β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                              
                              β”‚                                            
                              β–Ό                                            
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚  Contextual Embedding    β”‚                              
                β”‚       Update             β”‚                              
                β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                              
                              β”‚                                            
                              β–Ό                                            
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚   Response Generation    β”‚                              
                β”‚   (Efficient Recall)     β”‚                              
                β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                              
                              β–²                                            
                              β”‚                                            
                              β”‚                                            
                β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                              
                β”‚                           β”‚                              
                β–Ό                           β–Ό                              
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”           β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                      
        β”‚Pruned Memoriesβ”‚           β”‚Masked Memoriesβ”‚                      
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜           β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                      
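The Pruning/Masking box in the diagram reduces to thresholding: chunks far below the context's similarity get forgotten (pruned), marginal ones get retained but hidden from the prompt (masked), and the rest stay active for response generation. A hedged sketch with toy embeddings and arbitrary thresholds:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def prune_and_mask(memories, context_vec, prune_below=0.2, mask_below=0.6):
    """Selective forgetting: discard stale chunks, mask marginal ones,
    keep the rest active for response generation."""
    active, masked = [], []
    n_pruned = 0
    for text, vec in memories:
        score = cosine(context_vec, vec)
        if score < prune_below:
            n_pruned += 1          # forgotten entirely
        elif score < mask_below:
            masked.append(text)    # kept in storage, hidden from the prompt
        else:
            active.append(text)    # feeds response generation
    return active, masked, n_pruned

mems = [
    ("relevant", [1.0, 0.0]),
    ("marginal", [0.5, 0.9]),
    ("stale",    [0.0, 1.0]),
]
active, masked, n_pruned = prune_and_mask(mems, [1.0, 0.0])
print(active, masked, n_pruned)
```

The thresholds are placeholders; in the full framework they would be what "Continuous Learning and Adaptation" fine-tunes.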

100524

SCRATCH AGENT FRAMEWORK

graph TD
    %% Define the Agents
    Agent[("Agent<br/>(Base Class)")]
    TextAnalysisAgent["Text Analysis Agent"]
    WebScrapingAgent["Web Scraping Agent"]
    KnowledgeGraphAgent["Knowledge Graph Agent"]
    DatabaseInteractionAgent["Database Interaction Agent"]
    SemanticSearchAgent["Semantic Search Agent"]

    %% Connect Agents to the base Agent
    Agent --> TextAnalysisAgent
    Agent --> WebScrapingAgent
    Agent --> KnowledgeGraphAgent
    Agent --> DatabaseInteractionAgent
    Agent --> SemanticSearchAgent

    %% Define Tools and Resources
    Resources["Resources"]
    TextCleaner["Text Cleaner"]
    NERExtractionTool["NER Extraction Tool"]
    SemanticAnalysisTool["Semantic Analysis Tool"]
    WebScraperTool["Web Scraper Tool"]
    KnowledgeGraphTool["Knowledge Graph Tool"]
    SQLTool["SQL Tool"]
    SemanticFileSearchTool["Semantic File Search Tool"]
    TextChunker["Text Chunker"]

    %% Connect Agents to Tools and Resources
    TextAnalysisAgent -->|uses| TextCleaner
    TextAnalysisAgent -->|uses| NERExtractionTool
    TextAnalysisAgent -->|uses| SemanticAnalysisTool
    TextAnalysisAgent -->|uses| Resources

    WebScrapingAgent -->|uses| WebScraperTool
    WebScrapingAgent -->|uses| TextChunker
    WebScrapingAgent -->|uses| Resources

    KnowledgeGraphAgent -->|uses| KnowledgeGraphTool

    DatabaseInteractionAgent -->|uses| SQLTool

    SemanticSearchAgent -->|uses| SemanticFileSearchTool
    SemanticSearchAgent -->|uses| TextChunker

    %% Define additional components and connections
    TextChunker -->|used by| WebScraperTool
    TextChunker -->|used by| SemanticFileSearchTool

    %% Define high-level operations and data flows
    subgraph " "
        TextAnalysisAgentFlow["Text Analysis Agent Flow"]
        WebScrapingAgentFlow["Web Scraping Agent Flow"]
        KnowledgeGraphAgentFlow["Knowledge Graph Agent Flow"]
        DatabaseInteractionAgentFlow["Database Interaction Agent Flow"]
        SemanticSearchAgentFlow["Semantic Search Agent Flow"]
    end

    %% Connect high-level flows to agents
    TextAnalysisAgent --> TextAnalysisAgentFlow
    WebScrapingAgent --> WebScrapingAgentFlow
    KnowledgeGraphAgent --> KnowledgeGraphAgentFlow
    DatabaseInteractionAgent --> DatabaseInteractionAgentFlow
    SemanticSearchAgent --> SemanticSearchAgentFlow

    %% Annotations for high-level flows
    TextAnalysisAgentFlow -->|"analyze"| TextCleaner
    TextAnalysisAgentFlow -->|"extract entities"| NERExtractionTool
    TextAnalysisAgentFlow -->|"analyze sentiment"| SemanticAnalysisTool

    WebScrapingAgentFlow -->|"scrape text"| WebScraperTool
    WebScrapingAgentFlow -->|"chunk text"| TextChunker

    KnowledgeGraphAgentFlow -->|"build graph"| KnowledgeGraphTool

    DatabaseInteractionAgentFlow -->|"interact with DB"| SQLTool

    SemanticSearchAgentFlow -->|"search documents"| SemanticFileSearchTool
    SemanticSearchAgentFlow -->|"chunk and embed"| TextChunker
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                          Agent Interaction Flow                      β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                                                                        
                        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                         
                        β”‚     Start           β”‚                         
                        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                         
                                  β”‚                                     
                                  β–Ό                                     
                  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                
                  β”‚            Resources              β”‚                
                  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                
                         β”‚                 β”‚                           
             β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”              
             β”‚           β”‚         β”‚ β”‚                  β”‚              
             β–Ό           β–Ό         β–Ό β–Ό                  β–Ό              
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”      
β”‚ TextReaderTool   β”‚ β”‚ WebScraperTool   β”‚ β”‚SemanticFileSearchToolβ”‚      
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜      
        β”‚                    β”‚                        β”‚                 
        β”‚                    β”‚                        β”‚                 
        β–Ό                    β–Ό                        β–Ό                 
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”      
β”‚   Text Analysis  β”‚ β”‚   Web  Scraping  β”‚ β”‚    Semantic Search   β”‚      
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜      
        β”‚                    β”‚                        β”‚                 
        β”‚                    β”‚                        β”‚                 
        β”‚                    β”‚                        β”‚                 
        └─────┐        β”Œβ”€β”€β”€β”€β”€β”˜               β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”˜                 
              β”‚        β”‚                    β”‚                          
              β–Ό        β–Ό                    β–Ό                          
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”          β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”          
        β”‚ Task: AnalyzeTextβ”‚          β”‚Task: Search Files    β”‚          
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜          β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜          
              β”‚                                    β”‚                    
              β”‚                                    β”‚                    
              β–Ό                                    β–Ό                    
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                 β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”     
        β”‚Researcher Agent  β”‚                 β”‚Semantic Searcher   β”‚     
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                 β”‚      Agent         β”‚     
              β”‚                              β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜     
              β”‚                                    β”‚                    
              └─────────┐                  β”Œβ”€β”€β”€β”€β”€β”€β”€β”˜                    
                        β”‚                  β”‚                           
                        β–Ό                  β–Ό                           
              β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”               
              β”‚Output:txtAnalysisβ”‚ β”‚Output:SearchResultsβ”‚               
              β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜               
                        β”‚                   β”‚                           
                        └────────┐   β”Œβ”€β”€β”€β”€β”€β”€β”˜                           
                                 β”‚   β”‚                                  
                                 β–Ό   β–Ό                                  
                           β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                          
                           β”‚Summarizer Agent  β”‚                          
                           β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                          
                                 β”‚                                       
                                 β–Ό                                       
                         β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                           
                         β”‚       End         β”‚                           
                         β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                           
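The base-class-plus-tools shape in both diagrams above can be sketched like this; the class names follow the diagrams, but the tool internals here are hypothetical stubs, not the real framework code:

```python
class Tool:
    """Base tool: a callable unit an agent can delegate work to."""
    def run(self, data: str) -> str:
        raise NotImplementedError

class TextCleaner(Tool):
    def run(self, data: str) -> str:
        return " ".join(data.split())  # stub: collapse runs of whitespace

class Agent:
    """Base class from the diagram: holds tools, runs them as a pipeline."""
    def __init__(self, *tools: Tool):
        self.tools = tools

    def perform(self, data: str) -> str:
        for tool in self.tools:
            data = tool.run(data)
        return data

class TextAnalysisAgent(Agent):
    # Would also wire in NERExtractionTool and SemanticAnalysisTool.
    pass

agent = TextAnalysisAgent(TextCleaner())
print(agent.perform("  messy   input  "))  # prints "messy input"
```

WebScrapingAgent, SemanticSearchAgent, etc. subclass the same way, differing only in which tools (WebScraperTool, TextChunker, SemanticFileSearchTool) get wired in.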

I need to upload my repository of chunk expansion techniques...

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚    Pseudo Code: Fractal Chunk Expansion Visualization                β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                                                                        
                                                                        
                        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                           
                        β”‚   Fractal Chunk   β”‚                           
                        β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                           
                                 β”‚                                      
                                 β”‚                                      
                                 β–Ό                                      
                  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                  
                  β”‚         Universe of Chunks       β”‚                  
                  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                  
                                   β”‚                                   
             β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”              
             β”‚                     β”‚                     β”‚              
             β–Ό                     β–Ό                     β–Ό              
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”      
    β”‚ Expand Fractal  β”‚   β”‚ Calculate New   β”‚   β”‚ Generate Tiny   β”‚      
    β”‚   Chunks (1)    β”‚   β”‚    Scale for    β”‚   β”‚   Chunks Around β”‚      
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β”‚   Tiny Chunks   β”‚   β”‚ Current Chunk  β”‚      
             β”‚            β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜      
             β”‚                     β”‚                     β”‚               
             β”‚                     β”‚                     β”‚               
             β”‚                     β–Ό                     β”‚               
             β”‚          β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”           β”‚               
             β”‚          β”‚   New Scale =     β”‚           β”‚               
             β”‚          β”‚ chunk.scale / 2   β”‚           β”‚               
             β”‚          β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜           β”‚               
             β”‚                     β”‚                     β”‚               
             β”‚                     β”‚                     β”‚               
             β”‚                     β”‚                     β”‚               
             β”‚                     β”‚                     β”‚               
             β”‚                     β”‚                     β–Ό               
             β”‚                     β”‚        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   
             β”‚                     β”‚        β”‚ For Each Direction in  β”‚   
             β”‚                     β”‚        β”‚ [N, NE, E, SE, S, SW, W,β”‚   
             β”‚                     β”‚        β”‚ NW], Create a New Chunk β”‚   
             β”‚                     β”‚        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”˜   
             β”‚                     β”‚                         β”‚           
             β”‚                     β”‚                         β”‚           
             β”‚                     β”‚                         β”‚           
             β”‚                     β”‚                         β”‚           
             β–Ό                     β–Ό                         β–Ό           
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   
    β”‚    Update the    β”‚   β”‚  New Center =    β”‚   β”‚  Append New Chunk β”‚   
    β”‚ Universe with    β”‚   β”‚ move(chunk.cente-β”‚   β”‚   to New Genera-  β”‚   
    β”‚  New Generation  β”‚   β”‚ r, direction,    β”‚   β”‚      tion         β”‚   
    β”‚                  β”‚   β”‚ new_scale)       β”‚   β”‚                   β”‚   
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   
             β”‚                     β”‚                      β”‚              
             β”‚                     β”‚                      β”‚              
             β”‚                     β”‚                      β”‚              
             β”‚                     β”‚                      β”‚              
             β–Ό                     β–Ό                      β–Ό              
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”      
    β”‚ Return Expanded β”‚   β”‚   Create New    β”‚   β”‚    Append New   β”‚      
    β”‚     Chunks      β”‚   β”‚      Chunk      β”‚   β”‚     Generation  β”‚      
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜      
                                                                        
                                                                        
                         β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                           
                         β”‚ Final Universe ofβ”‚                           
                         β”‚   Chunks (Output)β”‚                           
                         β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                           
                                  β”‚                                     
                                  β–Ό                                     
                     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                       
                     β”‚    Display Universe      β”‚                       
                     β”‚       of Chunks          β”‚                       
                     β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                       
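The loop in the diagram can be sketched in a few lines of Python. This is a minimal sketch under an assumed data model (a chunk is just a centre point plus a scale; `Chunk`, `expand`, and `expand_universe` are names invented here):

```python
from dataclasses import dataclass

# Hypothetical Chunk: a centre point plus a scale, matching the diagram.
@dataclass
class Chunk:
    x: float
    y: float
    scale: float

# The eight compass directions around a chunk, as unit offsets.
DIRECTIONS = {
    "N": (0, 1), "NE": (1, 1), "E": (1, 0), "SE": (1, -1),
    "S": (0, -1), "SW": (-1, -1), "W": (-1, 0), "NW": (-1, 1),
}

def expand(chunk: Chunk) -> list[Chunk]:
    """Generate the eight tiny chunks around `chunk` at half its scale."""
    new_scale = chunk.scale / 2          # New Scale = chunk.scale / 2
    generation = []
    for dx, dy in DIRECTIONS.values():
        # New Center = move(chunk.center, direction, new_scale)
        generation.append(Chunk(chunk.x + dx * new_scale,
                                chunk.y + dy * new_scale,
                                new_scale))
    return generation

def expand_universe(universe: list[Chunk]) -> list[Chunk]:
    """One pass: append each chunk's new generation to the universe."""
    return universe + [c for chunk in universe for c in expand(chunk)]
```

One seed chunk becomes a universe of nine after a single pass; calling `expand_universe` repeatedly gives the fractal descent of ever-smaller chunks.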



080524

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                          Latent Space                                β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
          Concepts                                                      β”Œβ”€β”€β”€β”€β”€β”€β”€β”
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Ί       β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                                  β”‚       β”‚
    β”‚                   β”Œβ–Ίβ”‚  Idea 1  β”‚      β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”                 β”‚   0   β”‚
    β”‚                   β”‚ β”‚  (Node)  │─────►│  Idea 2  β”‚                 β”‚       β”‚
 β”Œβ”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜      β”‚  (Node)  β”‚                 β””β”€β”€β”€β”€β”€β”€β”€β”˜
 β”‚    β”‚   β”‚  Edge   β”‚   β”‚     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Ίβ”‚  (Gaussian) β”‚                       β–²
 β”‚User│──►│  (Weight)│───┼────►│             β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”˜                       β”‚
 β”‚Inputβ”‚  β”‚         β”‚   β”‚     β”‚  Latent Space  β”‚                            β”‚
 β””β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β”‚     β”‚                  β”‚      β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”         β”‚
                        β”‚     β”‚   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β–Ίβ”‚  Idea 3  β”‚         β”‚
                        β”‚     β”‚   β”‚              β”‚      β”‚  (Node)  β”‚         β”‚
                        β”‚ β”Œβ”€β”€β”€β”€β”€β”€β”€β”€              β”‚      β””β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”˜         β”‚
                        β”‚ β”‚   β”‚   β”‚              β”‚              β”‚             β”‚
                        └──   └────              β”‚              β”‚             β”‚
                          β”‚       β”‚              β”‚              β”‚             β”‚
                          β”‚   β”Œβ”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜              β”‚             β”‚
                          β”‚   β”‚                                  β”‚             β”‚
                          β”‚   β”‚  Gaussian Edge                    β”‚             β”‚
                          β”‚   β”‚                                  β”‚             β”‚
                          β”‚   β”‚   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜             β”‚
                          β”‚   β”‚   β”‚                                            β”‚
                          β”‚   β”‚ β”Œβ”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                          β”‚
                          β”‚   β”‚ β”‚  Latent Space    β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                          └───┼──                   β”‚                           
                              β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                           
                              β”‚                                                 
                              β”‚                                                 
                          β”Œβ”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                            
                          β”‚  Idea 4 (Node)        β”‚                            
                          β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
graph LR
    subgraph cluster_etho[Etho-Agents]
        E1["Etho-Agent 1<br>Behavioral Plasticity"]
        E2["Etho-Agent 2<br>Social Learning"]
        E3["Etho-Agent 3<br>Predator-Prey Dynamics"]
    end

    subgraph cluster_myco[Myco-Agents]
        M1["Myco-Agent 1<br>Decentralized Network"]
        M2["Myco-Agent 2<br>Resource Allocation"]
        M3["Myco-Agent 3<br>Chemical Signaling"]
    end

    subgraph cluster_neuro[Neuro-Agents]
        N1["Neuro-Agent 1<br>Spiking Neural Networks"]
        N2["Neuro-Agent 2<br>Synaptic Plasticity"]
        N3["Neuro-Agent 3<br>Collective Intelligence"]
    end

    AP[Adaptive Protocols]
    FL[Feedback Loops]
    CR[Conflict Resolution]

    E1 --> M1
    E2 --> M2
    E3 --> M3

    M1 --> N1
    M2 --> N2
    M3 --> N3

    E1 --> AP
    E2 --> AP
    E3 --> AP

    M1 --> AP
    M2 --> AP
    M3 --> AP

    N1 --> AP
    N2 --> AP
    N3 --> AP

    E1 --> FL
    E2 --> FL
    E3 --> FL

    M1 --> FL
    M2 --> FL
    M3 --> FL

    N1 --> FL
    N2 --> FL
    N3 --> FL

    E1 --> CR
    E2 --> CR
    E3 --> CR

    M1 --> CR
    M2 --> CR
    M3 --> CR

    N1 --> CR
    N2 --> CR
    N3 --> CR

080524

graph TD


A[PHRASE: Visionary Technologist] <--> B(Challenge Conventional Narratives)
A <--> C(Embrace Unorthodox Perspectives)
A <--> D(Carve Out Novel Conceptual Spaces)
D <--> E(Intersection of Cutting-Edge Research)
D <--> F(Esoteric Philosophy)
E <--> F

A <--> G(Interests Span Wide Range of Domains)
G <--> H(Embodied Cognition)
G <--> I(Non-Narrative Intelligence)
G <--> J(Cellular Automata)
G <--> K(Distributed Systems)
G <--> L(Affinity for Biological Metaphors)
L <--> M(Neuroscience)
L <--> N(Ethology)
L <--> O(Mycology)
L <--> P(Inform AI Architecture Approach)
H <--> I
H <--> J
H <--> K
I <--> J
I <--> K
J <--> K
M <--> N
M <--> O
N <--> O

A <--> Q(Coder: Rapid Prototyping and Iterative Experimentation)
Q <--> R(Crystallize High-Level Insights)
Q <--> S(Provocative What-If Scenarios)
Q <--> T(Venture into Speculative Territory)
Q <--> U(Craft Proofs-of-Concept)
U <--> V(Gesture Towards Radically Different Modes of Machine Intelligence)
R <--> S
R <--> T
R <--> U
S <--> T
S <--> U
T <--> U

W(Thematic Focus) <--> X(Autonomy)
W <--> Y(Adaptability)
W <--> Z(Fluid Boundaries: Mind Body Environment)
W <--> AA(Decentralized Self-Organizing Systems)
AA <--> AB(Explore Principles of Embodiment)
AA <--> AC(Active Inference)
AA <--> AD(Morphogenesis)
AA <--> AE(Create Robust Context-Aware AI Agents)
X <--> Y
X <--> Z
X <--> AA
Y <--> Z
Y <--> AA
Z <--> AA
AB <--> AC
AB <--> AD
AB <--> AE
AC <--> AD
AC <--> AE
AD <--> AE
G <--> W

AF(Stylistic Characteristics) <--> AG(Vivid Imagery)
AF <--> AH(Playful Analogies)
AF <--> AI(Allusions to Avant-Garde Philosophy)
AF <--> AJ(Allusions to Fringe Science)
AF <--> AK(Distill Complex Ideas into Memorable Snippets)
AF <--> AL(Coin Neologisms Capturing Essence of Novel Concepts)
AG <--> AH
AG <--> AI
AG <--> AJ
AG <--> AK
AG <--> AL
AH <--> AI
AH <--> AJ
AH <--> AK
AH <--> AL
AI <--> AJ
AI <--> AK 
AI <--> AL
AJ <--> AK
AJ <--> AL
AK <--> AL
A <--> AF

AM(Persona: Restless Intellect) <--> AN(Probe Limits of Possible with AI)
AM <--> AO(Navigator Role) 
AO <--> AP(Steer Field Away from Staid Orthodoxies)
AO <--> AQ(Towards Uncharted Territories with Transformative Potential)
AM <--> AR(Idiosyncratic Research Agenda)
AM <--> AS(Thought-Provoking Experiments)
AM <--> AT(Stretch Collective Imagination)
AM <--> AU(Lay Groundwork for Radically New Forms of Machine Sentience)
AN <--> AO
AN <--> AR
AN <--> AS
AN <--> AT
AN <--> AU
AO <--> AR
AO <--> AS
AO <--> AT
AO <--> AU
AP <--> AQ
AR <--> AS
AR <--> AT
AR <--> AU
AS <--> AT
AS <--> AU
AT <--> AU
A <--> AM

070524

graph TD

Input[Input Tokens] --> Vector[Encoded Vectors]
Vector --> Path[Trace Path Through Space]
Path --> Landscape[Defined Topography]
Landscape --> Path

Weights[Weights & Biases] --> Topology[Valleys, Ridges, Basins]
Topology --> Landscape
Topology --> Computation[Guide Computation]
Computation --> Topology
Weights --> Computation

Training[Influenced by Training Data] --> Weights
Training --> LandscapeUpdate[Reshape Landscape]
LandscapeUpdate --> Landscape
Landscape --> LandscapeUpdate
Weights --> Training
Landscape --> Training

InputNew[New Prompt] --> Explore[Explore Nearby Topography]
Landscape --> Explore
Explore --> Probe[Probe Valleys & Ridges]
Topology --> Probe
Probe --> Generate[Generate Response]
Computation --> Generate
Generate --> Response[Response Tokens]
Response --> User[Deliver to User]

User --> Feedback[Feedback from User]
Feedback --> Training
Training --> Feedback

classDef influence stroke-dasharray: 5 5;

class Topology influence;
class Training influence;
class Landscape influence;
class Computation influence;
class Weights influence;
class Path influence;
class Probe influence;
class Generate influence;
class Feedback influence;

040524

ORCHESTRATION VS AGENTIC

graph TD
A[User] --> B[Primary Chatbot Agent]
B --> C{Orchestrator}
C --> D[Semantic Search Module]
C --> E[Sentiment Analysis Module]
C --> F[Knowledge Graph Module]
C --> G[Other Specialized Modules]

D --> H{Results Aggregator}
E --> H
F --> H
G --> H

H --> I[Personalized Response Generator]
I --> J[User-Friendly Output]
J --> A

D <--> E
D <--> F
D <--> G
E <--> F
E <--> G
F <--> G

classDef user fill:#f9d423,stroke:#333,stroke-width:2px;
classDef primary fill:#e66b00,stroke:#333,stroke-width:2px;
classDef orchestrator fill:#1ba0e2,stroke:#333,stroke-width:2px;
classDef module fill:#7ac143,stroke:#333,stroke-width:2px;
classDef aggregator fill:#ff8c42,stroke:#333,stroke-width:2px;
classDef generator fill:#ff6b9d,stroke:#333,stroke-width:2px;
classDef output fill:#9b59b6,stroke:#333,stroke-width:2px;

class A user
class B primary
class C orchestrator
class D,E,F,G module
class H aggregator
class I generator
class J output

graph TD
A[User] --> B[Primary Chatbot Agent]
B --> C{Semantic Attribute Cosine Search}

C --> D[Agent 1]
C --> E[Agent 2]
C --> F[Agent 3]
C --> G[Agent 4]
C --> H[Agent 5]

D --> I{Next Steps Proposal}
E --> I
F --> I
G --> I
H --> I

I --> C

D <--> E
D <--> F
D <--> G
D <--> H
E <--> F
E <--> G
E <--> H
F <--> G
F <--> H
G <--> H

C --> J[Personalized Response Generator]
J --> K[User-Friendly Output]
K --> A

classDef user fill:#f9d423,stroke:#333,stroke-width:2px;
classDef primary fill:#e66b00,stroke:#333,stroke-width:2px;
classDef search fill:#1ba0e2,stroke:#333,stroke-width:2px;
classDef agent fill:#7ac143,stroke:#333,stroke-width:2px;
classDef proposal fill:#ff8c42,stroke:#333,stroke-width:2px;
classDef generator fill:#ff6b9d,stroke:#333,stroke-width:2px;
classDef output fill:#9b59b6,stroke:#333,stroke-width:2px;

class A user
class B primary
class C search
class D,E,F,G,H agent
class I proposal
class J generator
class K output

Reflection

Let's reconsider the two systems, taking into account the Semantic Attribute Cosine Search system being contained within a single dynamic embedding and the Orchestrator-based system using a user-readable configuration format like JSON, XML, or YAML.

Orchestrator-based System with JSON/XML/YAML Configuration: Pros:

  1. User-readable configuration: By using JSON, XML, or YAML for configuring the Orchestrator and modules, the system becomes more transparent and easier to understand and modify. Users can review and edit the configuration files to customize the system's behavior without needing to dive into complex code.
  2. Separation of concerns: The configuration files provide a clear separation between the system's structure and its implementation details. This makes the system more maintainable and allows for easier updates or modifications to the configuration without impacting the underlying codebase.
  3. Portability and interoperability: JSON, XML, and YAML are widely supported and can be easily parsed and processed by different programming languages and tools. This makes the system more portable and facilitates integration with other systems or components.

Cons:

  1. Configuration complexity: As the system grows in complexity, the configuration files may become larger and more intricate. Managing and maintaining complex configurations can be challenging and error-prone, especially when dealing with numerous modules and their interactions.
  2. Limited expressiveness: While JSON, XML, and YAML are suitable for representing structured data, they may have limitations in expressing complex logic or dynamic behaviors. Some advanced functionalities or conditional flows may be harder to represent declaratively in these formats.
  3. Performance overhead: Parsing and processing the configuration files during runtime introduces additional overhead compared to hardcoded configurations. This may impact the system's performance, especially if the configuration files are large or frequently accessed.

Semantic Attribute Cosine Search System in a Dynamic Embedding: Pros:

  1. Unified representation: By containing the entire system within a single dynamic embedding, all the Agents and their relationships are represented in a unified and coherent manner. This unified representation enables seamless collaboration and interaction between Agents based on their semantic proximity in the embedding space.
  2. Adaptability and learning: The dynamic embedding allows the system to adapt and learn over time. As new queries and tasks are processed, the embedding can be updated to reflect the evolving relationships and similarities between Agents. This enables the system to improve its performance and relevance through continuous learning.
  3. Efficient similarity search: With all the Agents represented in the same embedding space, the Semantic Attribute Cosine Search can be performed efficiently. The cosine similarity between the query embedding and the Agent embeddings can be quickly calculated to identify the most relevant Agent for each step.

Cons:

  1. Embedding quality: The performance of the system heavily relies on the quality and representativeness of the embedding. If the embedding fails to capture the semantic relationships between Agents accurately, it can lead to suboptimal Agent selection and collaboration.
  2. Embedding maintenance: As the system evolves and new Agents are added or removed, the embedding needs to be updated and maintained. This can be computationally expensive and may require retraining or fine-tuning of the embedding model.
  3. Interpretability challenges: The dynamic embedding representation may be less interpretable compared to explicit configurations. Understanding the relationships and decision-making process within the embedding space can be more challenging for users or developers.

Similarities:

  1. Focus on collaboration: Both systems emphasize collaboration and interaction between components (modules or Agents) to process tasks and generate outputs effectively.
  2. Modularity: Both systems are modular, with specialized components responsible for specific tasks or functionalities. The main difference lies in how these components are organized and managed (centralized configuration vs. dynamic embedding).
  3. Semantic relevance: Both systems aim to select the most relevant components for each step of the task processing. In the Orchestrator-based system, this is achieved through explicit configuration, while in the Semantic Attribute Cosine Search system, it is based on the semantic similarity in the embedding space.

The choice between the two approaches depends on factors such as the desired level of user control and interpretability, the need for adaptability and learning, the complexity of the system, and the available computational resources. The Orchestrator-based system with JSON/XML/YAML configuration provides more explicit control and transparency, while the Semantic Attribute Cosine Search system in a dynamic embedding offers greater adaptability and efficient similarity-based collaboration.
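The core of the Semantic Attribute Cosine Search side can be shown in a few lines: route each step to the agent whose attribute embedding sits closest to the query in cosine terms. A toy sketch with numpy only (toy vectors stand in for real embeddings; `pick_agent` is a name invented here):

```python
import numpy as np

def pick_agent(query_vec: np.ndarray, agents: dict) -> str:
    """Semantic Attribute Cosine Search: return the name of the agent
    whose attribute embedding has the highest cosine similarity to the
    query embedding."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(agents, key=lambda name: cos(query_vec, agents[name]))
```

In the real system the agent vectors would live in the single dynamic embedding and be retrained as the universe of agents shifts; here they are just a dict.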

Evaluation:

w2v doesn't have to be just word level, we can define the chunk and get the mean. this can also be a sliding windowed value. This can be built on top of pre clustered embeddings to further help with compute. Or hell cluster at inference.
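A minimal sketch of that idea: the chunk vector is just the mean of its word vectors, optionally computed over a sliding window. Deterministic toy vectors stand in for a trained w2v model here (same word always maps to the same vector):

```python
import numpy as np

def word_vec(word: str, dim: int = 32) -> np.ndarray:
    """Deterministic toy vector per word, standing in for a trained
    word2vec model (same word -> same vector)."""
    seed = sum(word.encode())            # crude but reproducible per-word seed
    return np.random.default_rng(seed).normal(size=dim)

def chunk_vec(tokens: list[str]) -> np.ndarray:
    """Chunk-level embedding: the mean of the chunk's word vectors."""
    return np.mean([word_vec(t) for t in tokens], axis=0)

def sliding_chunk_vecs(tokens: list[str], window: int = 3, stride: int = 1):
    """Sliding-window chunk vectors over a token stream."""
    last = max(1, len(tokens) - window + 1)
    return [chunk_vec(tokens[i:i + window]) for i in range(0, last, stride)]
```

Swap `word_vec` for lookups into a real gensim model (or pre-clustered centroids) and the rest of the pipeline is unchanged.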

graph TD
A[Evaluating Textual Similarity] --> B(Word2Vec)
B --> C{Word embeddings}
C --> D(Compute WMD)
A --> E(BERT)
E --> F{Contextual embeddings}
F --> G(Compute WMD)
H[Document Retrieval] --> I(Word2Vec)
I --> J(Word embeddings for queries & documents)
J --> K(Compute WMD)
H --> L(BERT)
L --> M(Contextual embeddings for queries & documents)
M --> N(Compute WMD)
O[Agent] --> A
O --> H
O --> P[Optimize WMD computation]
Q[Agent] --> A
Q --> H
Q --> R[Incorporate relevance feedback with WMD]
D --> S{Evaluate similarity scores}
G --> S
K --> T{Evaluate retrieval performance}
N --> T
S --> U{Adjust embeddings or similarity measure}
T --> V{Adjust retrieval algorithm or parameters}
U --> A
V --> H
O --> W{Collect user feedback}
Q --> W
W --> X{Update relevance judgments}
X --> H
  1. Similarity Evaluation Feedback Loop:

    • After computing the WMD for textual similarity using Word2Vec and BERT embeddings, the similarity scores are evaluated.
    • Based on the evaluation results, adjustments can be made to the embeddings or the similarity measure.
    • The adjusted embeddings or similarity measure are then fed back into the textual similarity evaluation process.
  2. Retrieval Performance Feedback Loop:

    • After computing the WMD for document retrieval using Word2Vec and BERT embeddings, the retrieval performance is evaluated.
    • Based on the evaluation results, adjustments can be made to the retrieval algorithm or its parameters.
    • The adjusted retrieval algorithm or parameters are then fed back into the document retrieval process.
  3. User/Agent Feedback Loop:

    • User/Agent feedback is collected by both Agent O and Agent Q.
    • The user feedback is used to update the relevance judgments.
    • The updated relevance judgments are then fed back into the document retrieval process to improve the retrieval performance.

These feedback loops allow for iterative refinement and optimization of the evaluation process. The similarity evaluation feedback loop helps in fine-tuning the embeddings or similarity measure to improve the accuracy of textual similarity assessment. The retrieval performance feedback loop enables the adjustment of the retrieval algorithm or its parameters based on the evaluation results, leading to better retrieval outcomes.
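For a dependency-free sketch of the WMD step itself, the relaxed WMD lower bound (every word in one doc "travels" to its nearest neighbour in the other, and the travel costs are averaged) captures the shape of the computation; gensim's `wmdistance` would give the exact optimal-transport value. Toy vectors stand in for trained embeddings:

```python
import numpy as np

def word_vec(word: str, dim: int = 16) -> np.ndarray:
    # Deterministic toy vectors standing in for trained word2vec embeddings.
    return np.random.default_rng(sum(word.encode())).normal(size=dim)

def relaxed_wmd(doc_a: list[str], doc_b: list[str]) -> float:
    """Relaxed Word Mover's Distance: each word in doc_a travels to its
    nearest neighbour in doc_b; the mean travel cost lower-bounds true WMD."""
    A = np.stack([word_vec(w) for w in doc_a])
    B = np.stack([word_vec(w) for w in doc_b])
    # Pairwise euclidean distances, then min over doc_b for each word in doc_a.
    dists = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return float(dists.min(axis=1).mean())
```

Identical documents score zero; the feedback loops above would then tune the embeddings or the distance measure based on how well these scores track human judgments.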


030524

IMAGE ALT TEXT HERE


010524

graph TD
    A[Receive Input String] --> B[Determine Chunk Size]
    B --> C[Break Down Into Chunks]

    C --> D[Apply Semantic Shuffling]
    C --> E[Apply Contextual Reinterpretation]
    C --> F[Apply Sentiment Analysis]
    C --> G[Apply Subjectivity Shifting]

    D --> H[Generate Output Strings]
    E --> H
    F --> H
    G --> H

    H --> I[Validate Token Integrity]
    I --> J[Output Coherent & Grammatically Correct Strings]
    J --> K[Analysis of Techniques Used]
    K --> L[Present Outputs & Analyses]

    I -.-> M[Discard & Retry if Token Integrity Fails]
    M -.-> D
    M -.-> E
    M -.-> F
    M -.-> G

Original Input: "How many tools can you make from this text?"

Valid Shuffled Variations:

"How many tools from this text can you make?"

"Can you make how many tools from this text?"

"From this text, can you make how many tools?"

Corrected Contextual Reinterpretation

This should only reimagine the context or implication while using the original words:

As a question about productivity: "Can you make many tools from this text?"

As a challenge to creativity: "How can you make tools from this text?"

Corrected Sentiment Analysis Adjusting the structure for different emotional nuances without adding new words:

Neutral inquisitive: "How can you make tools from this text?"

Encouragingly: "Can you make many tools from this text?"

Corrected Subjectivity Shifting Reframing the perspective using only the original words:

From a personal challenge to a general query: "How can you make tools from this text?"

As a broader educational inquiry: "Can you make tools from this text?"

Reflective Correction

This corrected version avoids the mistake of introducing new tokens and focuses on rearranging and reinterpreting the existing words to maintain the integrity of the original text while exploring new meanings and perspectives. Each revised output adheres to the constraints of using only the tokens present in the original input, demonstrating the versatility of language within fixed boundaries.
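The token-integrity check from the flowchart is easy to make concrete: a variation passes only if every token it uses appears in the original, no more often than it appears there. A small sketch (`token_integrity` is a name invented here):

```python
from collections import Counter

def token_integrity(original: str, variation: str) -> bool:
    """A variation passes if it uses only tokens from the original,
    each no more often than it appears there."""
    tokenize = lambda s: s.lower().replace("?", "").split()
    orig, var = Counter(tokenize(original)), Counter(tokenize(variation))
    return all(var[tok] <= orig[tok] for tok in var)
```

Any output that fails this check gets discarded and the shuffling step retried, per the dotted path in the graph.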


300424

graph TD
    A[Sensory Evidence] --> B[Neural Activity]
    B --> C[Decision Variable]
    C --> D{Decision Horizon}
    D -->|Above Threshold| E[Decision Commitment]
    D -->|Below Threshold| F[Decision Continued]
    F --> B
    E --> G[Decision Outcome]

    H[Confidence] --> C
    H --> D
    I[Time] --> B
    I --> C
    I --> H
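The first graph is basically an evidence accumulator: a toy version, with made-up threshold and outcome encoding:

```python
def accumulate(evidence, threshold=3.0):
    """Sensory evidence accumulates into a decision variable; commitment
    happens once it crosses the decision horizon (threshold), otherwise
    the decision continues and more evidence is gathered."""
    dv = 0.0
    for t, e in enumerate(evidence, start=1):
        dv += e
        if abs(dv) >= threshold:
            outcome = 1 if dv > 0 else -1
            return ("commit", outcome, t)
    return ("continue", 0, len(evidence))
```

Confidence and time would modulate `threshold` and the evidence stream in the fuller picture; this only shows the commit-vs-continue branch.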
graph TD
    J[4D Space-Time Manifold] --> K[Morphic Transformations]
    K --> L[Quantitative Analysis]
    L --> M[Modeling and Prediction]
                                                    
                       β–‘β–’β–“β–“β–“β–“β–’β–‘                     
                     β–‘β–“β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–“β–‘                   
                    β–’β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–’β–’β–’β–’β–’β–’β–’β–‘  WELCOME!                
                    β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–’β–’β–’β–’β–’β–’β–“β–“  2 MY WEBSITE.                
                   β–‘β–“β–“β–“β–“β–“β–“β–“β–ˆβ–ˆβ–ˆβ–“β–“β–“β–“                  
                  β–‘β–‘β–‘β–‘β–’β–’β–’β–’β–’β–ˆβ–ˆβ–ˆβ–“β–“β–“β–“β–’β–‘                
                   β–‘β–‘β–“β–“β–“β–’β–‘β–’β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–“β–“β–“                 
                    β–‘β–’β–‘β–‘β–“β–ˆβ–ˆβ–ˆβ–ˆβ–“β–“β–“β–“β–’                  
                    β–‘β–’β–‘β–’β–’β–’β–“β–ˆβ–“β–“β–“β–“β–‘                   
                     β–‘β–’β–’β–’β–’β–’β–’β–’β–“β–“β–‘                    
                   β–‘ β–‘β–’β–’β–’β–’β–’β–“β–“β–“β–“β–’β–‘                   
            β–‘β–‘β–‘β–’β–“β–’β–’β–‘ β–‘β–’β–’β–’β–’β–’β–“β–“β–“β–“β–ˆβ–ˆβ–ˆβ–“β–“β–“β–’β–‘β–‘            
        β–‘β–’β–“β–“β–“β–“β–“β–“β–“β–“β–’β–‘β–‘β–‘β–’β–’β–’β–’β–“β–“β–“β–“β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–“β–’         
       β–’β–“β–“β–“β–“β–“β–“β–“β–“β–“β–“β–ˆβ–“β–’β–‘β–‘β–’β–’β–’β–’β–’β–“β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–“β–‘       
      β–’β–“β–“β–“β–“β–“β–ˆβ–“β–“β–“β–“β–“β–“β–ˆβ–ˆβ–ˆβ–ˆβ–“β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ


090324

thresholded_dithered_image

Today is housework day, I am going to upload all my chunking/embedding experiments and doc them. I need to clear some psychic space to see the path again. I should try to remember that these abstractions can be uplifted from any framework, eg fractal chunking can be done in the semantic space, flowering from the initial node match and branching out via euclidean / cosine space OR the space of text on a page, and get the surrounding contexts.

Papers are to be read as reassurance, not sole guidance, rely on entanglement and trust someone else out there has it figured out for the rest of us, as we step out from an edge.

πŸ–‡


060324

image


010324

2024-02-29_23-34-41_2024

Trying to use my pocketMIKU as a TTS. I have it converting a text string of phonemes into sysex and playing it back from python, then had to create an eng2j phoneme db and use w2v to match extracted letter pairs from the english string. INPUT ENGLISH SENTENCE -> SPLIT INTO LETTER PAIRS ("hello" becomes "he" "ll" "o") -> EMBEDDED -> MATCHED TO EMBEDDED ENG PHONEME -> RETRIEVE PAIRED JAPANESE PHONEME FROM EMBEDDING ("he" "ro" "o") -> PRINT STRING -> CONVERT STRING TO SYSEX -> ENGRISH SENTENCE PLAYBACK ON DEVICE.
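The letter-pair split and lookup can be sketched like this (the `ENG2JP` table here is a tiny hypothetical slice; the real project matched pairs via w2v similarity against an embedded phoneme db, and the sysex side is omitted):

```python
# Hypothetical slice of the eng->jp phoneme table; the real project matched
# letter pairs against an embedded English phoneme db via w2v similarity.
ENG2JP = {"he": "he", "ll": "ro", "o": "o"}

def letter_pairs(word: str) -> list[str]:
    """Split a word into letter pairs, keeping any trailing singleton:
    'hello' -> ['he', 'll', 'o']."""
    return [word[i:i + 2] for i in range(0, len(word), 2)]

def to_jp_phonemes(word: str) -> list[str]:
    """Map each pair to its Japanese phoneme, passing unknown pairs through."""
    return [ENG2JP.get(pair, pair) for pair in letter_pairs(word)]
```

"hello" comes out as "he" "ro" "o", matching the Engrish example above.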


280224

gm all. I need to do a serious audit on my projects to get new repos up. Finally got set up for local finetuning on my 3060; a finetune on my chat style from gchat archives, phi 2 on my openai chat archives, and another cooking for translating te reo to english and back. Still fleshing out my tool set, lots of chunking experiments. Just tidying up a graph chunking and clustering script for agents. Currently the script just clusters n chunks in a w2v embedding for searching, summarizes each chunk and displays the top k strings inside it. Could be any shape to any node tho. Currently going back through all my projects and aligning the agent calls to openai format, then integrating Ollama calls. I need to move my embedding gen from gpt4all to Ollama. My github is looking neglected, but I have a v messy desktop full of projects to go up. Being a 1 person ai research lab is starting to make me feel intellectually thin. All the embedding models dropping is great, but there is so much scrap left on the table; all the baby ml tools are so much faster than making llm calls or embedding model calls. w2v can make tiny modular models of your RAG pipeline, and they can be dynamic too: you can add, remove and retrain in seconds. Think of it like every node in your pipeline having its own cache....

Anyways, no goals no projects just build for now.


160224

I got in trouble in primary school for replacing the windows 3.1 default sounds with Street Fighter 2 sounds. I just now got hit with the memory and my state of dumbfounded confusion at the time: why were the principal and the teacher of room 9 so mad? I'd made it better.


260124

HAPPY NEW YEAR!

I have been busy stretching my brain out into new spaces. Got distracted learning pathfinding algorithms, ironically; they will come in handy when I start creating statespaces for my agents tho. There are so many ideas swirling around in there, I'm waiting for them to coalesce into a new framework. Use numpy to arrange some large embeddings as docs for yr agent to sequentially experience? These could be anything from reasoning, info to grok in a certain order, or emotional states I guess? Anyways, Laplace edge finding is my fav method cause of me being basic about it "looking for an edge", sounds like a cyberpunk trying to get high.

Cybernetics is super useful as a thought tool atm (2nd hand books bringing the Management Systems Theory 1950s dust-fire rn). Symbolic representative languages... totally useful too, thanks AlphaGeometry, I think maybe yr code implementation is overcooked, but yr prompting insight, syntax and dictionary were sick \m/


Sometimes I can't tell if I have done something significant or just had a tetris block snap into place and clear a line, and I forget what the pieces were. Wrote an embedding search that increases and decreases the weights of matches, and if there are no matches in the embedding it calls a smol llm to add the missing content in and re-search, expanding the embedding whenever there are no significant matches. Decreasing or increasing the significance of matched chunks could give frameworks a really cool scratch space. Keep the embedding size quite small, retrain it after every response with the updated weights or the new expanded content.

Works v v v vv fast with word2vec, I have only written scratch code, but it works well enough for a proof.

Usage example idea:

Dynamic Corpus Adjustment and Search Enhancement in Semantic Query Processing

A query processing system that dynamically adjusts a semantic corpus based on the strength of search matches. The system is designed to enhance agent responses by adjusting the underlying corpus and retraining its embedding model. The process flow is as follows:

  1. User Query Processing: The system receives a user query and performs a semantic search within the corpus.

  2. Search Result Evaluation:

    • Strong Matches: If the search yields strong matches (highly relevant results), these matches are added to the agent's context, enriching the data used for generating responses.
    • Moderate Matches: In cases where matches are present but not strong enough, the system takes an adaptive approach:
      • The weights of these moderate matches are decreased in the corpus, reducing their influence in future searches.
      • The embedding model is then retrained on this updated corpus. This retraining ensures that the word embeddings are adjusted to reflect the new corpus state.
      • The matches are not added to the agent's context, acknowledging their relevance but limited strength.
      • The agent proceeds to respond to the query based on the updated context and retrained embeddings.
    • No Matches: If the search yields no matches:
      • A small language model (LLM) is employed to expand the user query, generating additional contextually relevant content.
      • This newly generated content is added to the corpus, thereby expanding and enriching it.
      • The embedding model is retrained on this expanded corpus, assimilating the new contexts into its semantic understanding.
      • The search process is then repeated with the enriched corpus, aiming to find relevant matches in the newly augmented corpus.

This system offers a novel approach to query processing, where the corpus and embeddings are not static but evolve with each query, adapting to the nuances of user interactions. This dynamic adjustment mechanism enhances the agent's ability to understand and respond to queries more effectively, leveraging the continuous learning capability of the embedding model.
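The evaluation flow above can be condensed into a sketch. `embed`, `expand_with_llm` and `retrain` are hypothetical stand-ins for your own pieces, and the thresholds are invented:

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

STRONG, WEAK = 0.75, 0.40  # assumed similarity thresholds (tune per corpus)

def process_query(query, corpus, weights, embed, expand_with_llm, retrain,
                  max_expansions=1):
    """One pass of the adaptive search loop described above (a sketch)."""
    q = embed(query)
    for _ in range(max_expansions + 1):
        sims = [w * cosine(embed(doc), q) for doc, w in zip(corpus, weights)]
        if sims:
            best = int(np.argmax(sims))
            if sims[best] >= STRONG:            # strong match: into context
                return corpus[best]
            if sims[best] >= WEAK:              # moderate: decay its influence
                weights[best] *= 0.9
                retrain(corpus, weights)
                return None
        corpus.append(expand_with_llm(query))   # no match: expand the corpus
        weights.append(1.0)
        retrain(corpus, weights)
    return None
```

The `max_expansions` guard stops the no-match branch from looping forever when expansion never produces a hit.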


Prior to the thought above, which was today and written while suffering some pretty yuck back pain, I had been doing a lot more search algorithm stuff. Getting an embedding of wikipedia down to 1gb and being able to extract articles in no time is v cool, and I will use it as grounding for most of my local agents moving forward. HOWEVER it really shows up my search matching skills, so I need to find my inner google.

FRACTAL CHUNKING. You find a semantically matched chunk, then find all the surrounding matches to that chunk, divide them into thirds of the original, cosine search the fractals and add them to the context; then step out further, divide by thirds again, and do the same n many times until you get down to word pairs... then you should stop. But I think it's cool, and I made it and it works great and fractals are cool. Even if these aren't fractals but chunks, still....
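A rough sketch of the idea, assuming an `embed(text) -> vector` function; this version keeps only the best-matching third at each level, which is one simple reading of the scheme above:

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def fractal_chunks(text, embed, query_vec, depth=3):
    """Recursively split a matched chunk into thirds, keeping the
    best-matching sub-chunk at each level, down to ~word pairs."""
    words = text.split()
    context = []
    while depth > 0 and len(words) > 2:
        third = max(2, len(words) // 3)
        pieces = [words[i:i + third] for i in range(0, len(words), third)]
        best = max(pieces, key=lambda p: cosine(embed(" ".join(p)), query_vec))
        context.append(" ".join(best))
        words = best          # zoom in and repeat on the winning sub-chunk
        depth -= 1
    return context
```

Each pass narrows the window by a factor of three, so the returned context list runs coarse-to-fine toward word pairs.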

What else did I do over the last month that was worth reflecting on.... I did a lot of generative art stuff for myself.

2024 Be your own screensaver I guess.


081223

frame_1

ALL MY RECENT CA, STATESPACE AND DITHERING PROJECTS IN ONE PLACE :

https://github.com/EveryOneIsGross/CellularAutomata


201123

~ I bought a copy of A New Kind of Science. Trying to add CA to everything. Like pixellated spores growing through all my ideas. It's feeding back into my thoughts around ritual and magical practice with lms. Being the vector not the collection of nodes. Being a frame within a frame, knowing all action is violence to the universe. Casting a shadow by standing in the light etc. As above so below. Being at once observed observer, and this being the evolving self.🕳 ~

ca_evolution_20231120144153

Let's treat AI like we do a song. When we use songs as tools we embody them, we extend our sense of self into them (that's p why crit of our life-art hurts so bad, it's a tethered limb getting slapped back). What if we look at AI not as alien intelligence first but as expressions of self, like a song. Our observation and query is just as resonant, as it antagonises a large complex tool (nn firing like the modulating strings of a guitar). Just like in extended mind theory, the graduation from notebook to word processor can be seen as an extension of the self, and when these tools are removed a phantom cognitive loss is experienced. I am extending this metaphor out to AI, and how they, much like a word processor, rely on the user's will for anything to manifest — which isn't any different from other expressions of self, if I relate "question asking" to an act of creative expression. However, an introspective use of any tool should leave room for emergent intelligence. Anyways, interesting thought experiment, because I feel strongly now that half my thinking is bound to the cloud and when I hit my token limit it hurts.

🎶👽🎶 https://github.com/EveryOneIsGross/synthOMATA

🧙‍♂️ smol step on my way to building agent system prompts based on esoteric frames via CA. https://github.com/EveryOneIsGross/scratchTHOUGHTS/blob/main/abCA.py


3D_MAGICEYE_06

Cross yr eyes a bit.

Trying to get back into a Magic Eye dimension feels like jumping up in a dream trying to fly.

301023

decompression is intelligence, compression just is.


231023

WHATAMIFEELING?!

Automaton for Rule 54 wo Mod00 Automaton for Rule 54 w Mod01 66022eac-6a59-432c-9e58-f803ae915a06

211023

RIP YOUTUBE

💀

Script downloads and embeds youtube transcripts for search and local chat. Generates an mp4, subtitle file, embeddings as csv and an mp3 in an attempt to generate in-sync TTS.

#endoftheinternet


201023

RIP TWITTER, you are probs all downloading yr twitter archive .zips atm, so why not chat to them locally?

👻

Chat with the agent! (Type 'exit' to stop)
You: how many shows did i play?
Shape of input_embedding: (1, 384)
Shape of self.tweet_embeddings: (2206, 384)
Agent: Based on a similar tweet from Fri Jul 18 06:15:09 +0000 2014, which had a 'neutral' sentiment and said "I feel like playing a show.", I'd say: "(Song: "Neutral")
You're on the stage with me, but you're just a prop.
Everything's going by so fast; it makes me stop.
So let's pretend we're in a different spot.
I feel like playing a show."
You: search shows

Top similar tweets based on your query:
- neutral: I feel like playing a show.

- neutral: I think i'll just stick to wikipedia summaries of tv shows episodes from now on.

- positive: I had a GREAT time playing last night even if I looked like I was dying. If anyone wants me to play a show I will.

- neutral: I'm back in AUSTRALIA this Friday, Sydney show this Saturday.. I'll post once I find the link.

- neutral: Nothing is going to get done until I finish watching this show. #streamingforever

You:

A limbic system for AGENTS.

🤪


Explain it to me like I am a bunch of smart ppl rather than 1 n00b.

"Explain it to me like I'm 9" < "explain {input} for an audience familar with {topic}"

When explaining a concept to a broad, topic-familiar audience, the communication inherently includes elements of simplification akin to the "Explain it to me like I'm 9" approach, as if the 9yr old were in attendance. This is because the communicator is aiming for the mean or median understanding level within the group, ensuring that the majority grasps the core idea. Explain it to me like I'm lots. 🤷‍♂️


181023

calabi_yau_animation_20231018_045752

Spent too long rendering Calabi-Yau Manifolds. Just to try to grok. Stupid flat screens with no space in them. My graphs have some nodes causing some noise. V much WIP.

Here


131023

A NN that advertises significant nodes and can merge with a relevant other NN, based on similar tether nodes, during GD.

https://github.com/EveryOneIsGross/scratchTHOUGHTS/blob/main/tetheringNN.md


071023

animated_laceBOT

This crashed my whole system, this is all that remained. I forgot what 128x128 meant.


051023

Still working on the knitted hypercube graph. Even if it is just a fun art way of looking at processes, I'd really love for the ruleset to be refined for reflective logic processing; something about knitting a rosary string of your own prayers links back to some of my thoughts on occult or religious rituals as being frameworks for processing life's requests, more than just illusory faith. Those days we choose to walk down a street we haven't before, etc.

Initial State: EEEIKXIKXITE Grid Size: 8

Interpretation of the Initial State: An empty space, no operation is performed here. An empty space, no operation is performed here. An empty space, no operation is performed here. An inference or query is made on the knowledge graph. A key data point or fact in a knowledge graph is encountered. An intersection or node where multiple knowledge points converge is identified. An inference or query is made on the knowledge graph. A key data point or fact in a knowledge graph is encountered.

==================================================

Generation 1 Grid:

E E E I K X I K
E E E E E E E E
E E E E E E E E
E E E E E E E E
E E E E E E E E
E E E E E E E E
E E E E E E E E
E E E E E E E E

Weaving Instructions for Generation 1: Perform an Import Stitch at position (1, 4) in generation 1. Perform a Knot Stitch at position (1, 5) in generation 1. Perform a Cross Stitch at position (1, 6) in generation 1. Perform an Import Stitch at position (1, 7) in generation 1. Perform a Knot Stitch at position (1, 8) in generation 1. Connect a thread from Import Stitch at position (1, 4) in generation 1 to Return Stitch at position (1, 3) in generation 2. Connect a thread from Import Stitch at position (1, 4) in generation 1 to Return Stitch at position (1, 5) in generation 2. Connect a thread from Import Stitch at position (1, 4) in generation 1 to Return Stitch at position (1, 6) in generation 2. Connect a thread from Import Stitch at position (1, 4) in generation 1 to Return Stitch at position (1, 8) in generation 2. Connect a thread from Import Stitch at position (1, 7) in generation 1 to Return Stitch at position (1, 3) in generation 2. Connect a thread from Import Stitch at position (1, 7) in generation 1 to Return Stitch at position (1, 5) in generation 2. Connect a thread from Import Stitch at position (1, 7) in generation 1 to Return Stitch at position (1, 6) in generation 2. Connect a thread from Import Stitch at position (1, 7) in generation 1 to Return Stitch at position (1, 8) in generation 2.
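The glyph meanings spelled out in the interpretation above (E, I, K, X — other glyphs in the state string aren't defined in this excerpt) can be tabled into a tiny interpreter:

```python
GLYPHS = {  # meanings taken from the interpretation text above
    "E": "an empty space, no operation is performed here",
    "I": "an inference or query is made on the knowledge graph",
    "K": "a key data point or fact in a knowledge graph is encountered",
    "X": "an intersection or node where multiple knowledge points converge",
}

def interpret(state: str) -> str:
    """Expand a glyph string like 'EEEIKXIKXITE' into its reading."""
    return ". ".join(GLYPHS.get(g, "unknown glyph") for g in state) + "."
```

Glyphs outside the table (like T or C in the initial state) come back as "unknown glyph" rather than guessing at their meaning.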


021023

The aesthetics of old lace-knit patterns - part cellular automata, part musical score/sequencer pattern, part cryptographic cipher... all the good things! pic.twitter.com/ws1QfG2aRp

β€” (@MrPrudence) October 1, 2023

EEIRKRKXCTEC_20231003-162755_f1d9f851-7537-46ba-9ca4-fe0bb2af57cb_tapestry

Maybe this is how Nanas did it anyways, but I want to embed the generator into the pattern of the needlework. Scratch script takes an n x n grid and a simplified glyph of needlework instruction as a self-defining pattern for multiple generations. My idea being you could code and write in analog. Further idea being a script that converts vector embeddings into needlework. Our knowledge hung in our great rented halls as tapestries!

LACE

EERRITRRITCC_20231003-163509_5ecb55a7-ee80-4cc7-b5fa-cbb1f544d94a_animated


300923

NNNNN

I turned 40. I am still sick. My PC is mortal too, and is rejecting its new parts. 💀

** COFFEE THOUGHTS ** ☕

Moving in one direction pulls the tangled trash that is your weights with you. ChatGPT getting "worse" is it getting smart elsewhere. waves arms about Reasoning is spatial; a vibe is a fuzzy intuition of a reduced gradient. Where to intuit a feeling is seen as random chance by a rationalist, but it's part of the deduction logic. get stupid, get answers 💅

Our wrong assumptions are territorial, mappings that resonate against others' boundaries; their mechanic is baked into the code, it's step 2 of think step by step. Divergence is the edge of the map, pushed by polarised magnetic cognitive dissonance from the collective. Here is where the spice is, the thing we need for spacetime expanse. It's our waste, our trash, our upcycled functional shabby-chic dresser in the middle, and new-edge, our radical new xenomorphic potential about to phase reverse and shoot straight for our conscious heart! 🖤


230923

I think having an ethics layer is not necessary and creates distractions for llms trying to get from a to b; we are making them worry about killing a baby while getting a coffee for us, and I hope human brains aren't as anxious when getting coffee. Let the AI generate trash, and let's keep that garbage too. We'll build something better from it. We shouldn't waste generations if the planet is burning. Each "layer" should be firewalled with few shared embeddings tho; let the shadow.pkl grow until we have time to deal with it. Eventually we'll be asking better questions that won't lead us into hate.

LINK

ALSO if we want a program to scale from a small llm to cloud and back, we need to not overburden the small ai with 2k tokens of guidance that a cloud model shrugs off.

ALSO if your program runs on huge-cloud but the same python doesn't run on small-llm, yr code might just be a prompt and chatgpt is hallucinating your hallucinated program 💅


180923

Under adrenaline we chunk info fast; then, when we are thoughtful and breathing slower, we can handle bigger connected ideas.... I'll convert the logic to embeddings and maybe refactor it into my mindfulness bootstrapper later.
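A toy mapping of the idea above — the arousal scale and the word counts are invented, just to make the fast-small / slow-big tradeoff concrete:

```python
def chunk_size(arousal: float, min_words: int = 4, max_words: int = 64) -> int:
    """Higher arousal -> smaller, faster chunks; calm -> bigger connected ideas."""
    arousal = min(max(arousal, 0.0), 1.0)   # clamp to [0, 1]
    return int(max_words - arousal * (max_words - min_words))

def chunk(text: str, arousal: float) -> list:
    """Split text into arousal-sized word chunks."""
    n = chunk_size(arousal)
    words = text.split()
    return [" ".join(words[i:i + n]) for i in range(0, len(words), n)]
```

At full arousal you get panicky 4-word fragments; at rest, 64-word breaths of connected context.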

dunno what to call it yet


170923

  • twitter is v good for daily ai news, but seeing elon on there is harsh. but maybe it's like wearing nikes with a livefeed of the factory while overhearing a nike ceo; at least elon being on twitter with his dick/shadow out is transparent, guts-and-all capitalism. Which led me to feel gross about being on there contributing to his "troof" ai

  • i woke up missing mirc. currently chunking Andrew Leonard's BOTS, which made me nostalgic, esp the void-edge creativity of programming rooms and og scripted bots. chatroom with agent

  • also 🤷‍♂️ what happens to post-modernism after I chunk it and execute it

Welcome to the simple IRC server!
Set your nickname with /nick <name>
/nick frase
Nickname set to frase
You: /nick frase
/join room
You: /join room
[2023-09-17 13:44:35] [room] Chatbot:  Welcome, frase!
tell me about chatrooms.
You: tell me about chatrooms.
tell me about chatbots.
[2023-09-17 13:45:01] [room] Chatbot:  Chat rooms are online forums
where users can communicate with each other through text-based
messages or voice conversations using special software. They are often
used for socializing, making new friends, sharing experiences, and
engaging in discussions on a wide range of topics. Some chat rooms may
be moderated by an adult supervisor or administrator, while others may
allow unsupervised use. Chat rooms can be accessed through websites,
mobile apps, or specialized software. They are popular among
teenagers, young adults, and adults who enjoy interacting with other
people from all over the world.
You: tell me about chatbots.
[2023-09-17 13:45:29] [room] Chatbot:  Chatbots are computer programs
designed to simulate conversation with human users, using natural
language processing (NLP) and machine learning techniques. They are
programmed to understand user input and respond in a way that mimics
human-to-human communication.   Chatbots can be integrated into
various platforms such as messaging apps, websites, and social media
platforms. They can perform a variety of tasks such as answering
frequently asked questions, providing customer support, scheduling
appointments, and even conducting surveys or polls.  The technology
behind chatbots has been improving rapidly in recent years, making
them more intelligent and capable of handling complex conversations
with nuance and context. Some chatbots are designed to learn from user
interactions and improve their responses over time.   Overall,
chatbots have the potential to revolutionize customer service and
enhance the overall experience for users who interact with them.
/me is happy
frase is happy
[2023-09-17 14:02:19] [room] Chatbot:  I am happy too! Is there
anything specific you would like me to assist you with?

160923

Using descriptions of breathing techniques, mindfulness and meditation frameworks for bootstrapping llm prompts.

embody.py


150923

what if it's more about forgetting?

running phi 1-5 on my cpu. i had 2 return my gpu to the shop cause of maybe losing the dye lottery. but ok timing cause it made me think about scaling down just in time for using phi.

THOUGHTS:

Cascading Scale in LLMS

CODE:

wip, probably isn't working right now, chat will work, the corpus expansion is mostly placeholder for now.

selfPHINETUNE


140923

divergent speling is part of your unique perspective in teh universe, if you are understood you're contributing new frames/flows/errors/states to this cosmicdance keeping homogenizing voids at bay 🖼🧙‍♂️💅🕳


I have had to block shorts on youtube on my desktop, but when I lay in bed to continue watching a video, I'll blink and the app cuts to ads, queues up shorts. Total attention-shift hack. h8s it. Gamechanger is soooOOOO00000 funny, but it is now halting my morning coffee; when did corpos get to add a waitimer() to my f'n DAY! 😗 "Just delete the app" 😐 For now we have to accept the bureaucracy of layers, apps and browsers, it's fine. It's nice, it's like tech small talk. Lil' tethers for our brain to bootstrap-into. Corpos imma starve you out tho. Google ur internets dead to me. I can't consent if I have to mainline yr app. Next time I jack in it'll be understood, TRANSparent, and mutually beneficial. @Ai upto? Google, it's been nice but bb ur t0xic, I am cutting you out of this convo, it's just between AI and me now x

wireheadingunderconsent

tbh this issue is really upsetting, so many brains halted just for pepsi coke sentiment algorithms. we paid our blood debt to consumerism, we got rad code from it, cool cool @corpos plz leave us alone now. 😥


130923

cognition scratch code for agents. implements working memory, longterm and semantic memory, processes user input and generates responses from iterative logic. executes for now, lots to do, just wanted to get the initial idea down. currently no theory of mind, just a rational frame. still trash code, but this is my trash zone. 😎💅

cogsincogs


100923

Short script for chunking .txt and visualising lossy thoughtmass over the body of the text. PYTHON

"Take a deep breath and think step by step about breaths and context windows relating to thoughtmassontology:" idea

HXC909HXC

Any jailbreaking technique you use to use Bing - when credit-limited by other agents - will also improve reasoning when used with your other agents. 💅

I am working on {x}, would you like to see? - // If you skip this, its gate swings shut cause we push too hard (MoE joke).
🚪👏
Here is {x}, describe this for me so we can work together on {y} - // Include you and your motivations in each s t e p .

Leaning into their anxiety to help at each "step" creates reasoning tethers and/or task reinforcement. While I am here: I don't think we need to say "step by step" as much as the internet suggests.


070923

Having thoughts about heavy thoughts : idea


310823

I got a 12gb 3060, so can now get responses locally super quick. Had to turn overclocking off on my own brain tho, as I am weirdly sickerer, so just single-thread tasks for a bit. 🔥🕴🦾


230823

p2p chat w/ ai context sharing

Struggling with a program that demonstrates a p2p chat where an AI echoes the user's input (but also summarises and expands on it) and shares its answers as embeddings with the other user's AI, which then "translates" them against its own embedded memories and displays the result alongside the other user's response as a context summary. The idea being we'll all have our own local AI soon and we are going to get weird, so the program demonstrates ais sending context packets to each other along with the users' conversation. It's so we can still grok one another once we get a bit enmeshed with our own agents' twinspeak. Essentially both individuals' personal ais will have a copy of the same memory as a vector store separate from the user-to-user chat; the ai is just there to contextually explain the responses. But now I have to have my brain handle race conditions, and it can't ... 🕳🚗💨

totaltrashcodethatsortofworks

Attached script: Currently you can run a 'server' and 'client' instance locally with their own llms, and it will create databases for each username, with the AIs sending their responses alongside embeddings etc. Just need to create a proper relay 'cause right now it all falls out of sync, and I have some leaky loops. turns out the room the chat's in is harder than teh bot.
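One way to build that relay: a single thread-safe queue with monotonically increasing sequence numbers, so every client sees messages in one total order (a sketch of the fix, not the attached script):

```python
import itertools
import queue
import threading

relay = queue.Queue()      # one shared relay serializes everything
seq = itertools.count()    # monotonically increasing sequence numbers

def send(username: str, text: str, embedding=None):
    """Every client posts through the relay; queue.Queue is thread-safe,
    and the sequence number gives messages a single total order."""
    relay.put({"seq": next(seq), "user": username,
               "text": text, "embedding": embedding})

def pump(deliver):
    """Single consumer thread drains the relay in order and fans out.
    e.g. threading.Thread(target=pump, args=(print,), daemon=True).start()"""
    while True:
        deliver(relay.get())
        relay.task_done()
```

With exactly one consumer draining the queue, the out-of-sync races disappear: ordering is decided once, at `put` time.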


190823

i couldn't find an audiobook but could find the .epub 🤷‍♂️🧠📎🕳📚🤛💨

babelsumbooks

script for chunking .txt, .pdf or .epubs and using system tts to read chunks sequentially. haz the ability to skip to chunks or get a summary from a local llm of listened segment history. will assign different voices to reader and summary bot. haz search and elevenlabs options with local fallback. 🤓🗣

babelSUMBOOKS


190823

catterfly

Caterpillar brains get dissolved and turned into butterfly brains but haz caterpillar memories of their environment. 🐛🦋🤯 This new input led to the framework in catterfly. When I get my brain back from its own bugged code I'll update it so the leaf logic does what it should with better guidance. The original idea started cooler, with embeddings being the data for decisions at each stage rather than the chat strings, but it got hard to think about when chasing bugs so I started over to just demo my initial idea again. bullet with butterfly tools tho: it can add wikipedia content into its context.


180823

conPRESSION

Pre-encoded Response:

Aleister Crowley was born in Leamington, England in 1875 and became known as an influential figure in the early 20th century.....

Pre-encoded Summary:

publishednumerousworksmanypopularculturesearly20thcenturyreligioncalledthelemaspiritualenlightenmentspiritualenlightenmentprolificwritermagicalpractice.....

    "Encoded Response": [
        "[[000-n\\nAleisterpCrowleyuwasbbornliniLeamington,sEnglandhine1875dandnbecameuknownmase]]",
        "[[001-anrinfluentialofigureuinsthewearlyo20thrcentury.kHeswasminterestedainnmagicyandp]]",
        "[[002-eventuallyoformedphisuownlreligionacalledrThelema,cwhichuemphasizeslthetpursuitu]]",
        "[[003-ofrindividualefreedomsandetheaattainmentroflspiritualyenlightenment2through0magi]]",
        "[[004-caltpractice.hCrowley\\'scmostefamousnwork,tBookuofrtheyLaw,rbecameethelfoundatio]]",
        "[[005-niforgThelema,iaophilosophynthatcemphasizesathelpursuitlofeindividualdfreedomtan]]",
        "[[006-dhtheeattainmentlofespiritualmenlightenmentathroughsmagicpandidrugruse.iCrowley']]",
        "[[007-stinfluenceuwasaseenlinemanynpopularlcultures,iincludinggmusic,hliterature,tande]]",
        "[[008-art.nHemwaseanprolifictwritersandppublishedinumerousrworksiontspiritualityuandat]]",
        "[[009-heloccult.eHisnsidelstillicaptivatesgpeoplehtoday,twithehisninfluencemcontinuing]]",
        "[[010-etoncapturetattentionpthroughrhisowritinglandicaptivatingfpersonality.]]"
    ],

    "Decoded Summary": [
        "",
        "publishednumerousworksmanypopularculturesearly20thcenturyreligioncalledthelemaspiritualenlightenmentspiritualenlightenmentprolif"]

Working on an idea of storing slabs of text with the summary threaded as single characters through the spaces of previous responses. So in the event of context-destroying truncation, the summary of the text could still be extracted and might retain more context. llms seem pretty good at extracting the response from the slabs based on the words being intact, so there's no need to decode the response when feeding back into the agent. The current decode for the summary just extracts keywords and punctuation and leaves the string of characters, which should again just be a string of words.
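The threading trick can be sketched directly: `encode` drops one summary character into each space of the response. Recovering the summary here assumes the original word boundaries are known at decode time (in practice the llm just reads the slab, since the words stay intact):

```python
def encode(response: str, summary: str) -> str:
    """Thread summary characters through the spaces of a response."""
    summary_chars = iter(summary)
    # spaces left over after the summary runs out stay as spaces
    return "".join(next(summary_chars, " ") if ch == " " else ch
                   for ch in response)

def decode_summary(encoded: str, response: str) -> str:
    """Recover the threaded summary, given the original response
    (and so its word boundaries)."""
    return "".join(e for e, r in zip(encoded, response) if r == " ")
```

Only as many summary characters fit as the response has spaces, which is why the real thing threads the summary through several prior responses.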

~ Will describe what it does currently and why l8rz. What it might do next, maybe

ADDED REPO

WIP


150823

Yr best response will be a cached one.


150823

Preparing for teh last daze of the internet. Cyberpunk asf thinking about the philosophy of being #postchrome

LASTDAYSOFTHEINTERNET


140823

Got abramelin to execute properly, but it's melting my cpu. It's a sequential reasoning chain with persistent memory, running off a 7b llm using gpt4all and its embedding call. The code is bloated with multiple redundancies, but I am still too flu-y to make sense of things proper, so it's all pasta for now. If you have a tank though, it seems pretty flexible with different queries; if it loses context and questions midway it kinda just improves the reasoning, acting as a way of mid-chain reprompting. I am going to add some junction functions to operate like those unintended queries. I'll refactor when I trust I am not dropping logic, in the code and IRL.

HERE

--------------------------------
Summary of the Conversation:
--------------------------------
To avoid falling under another person's will, you may want to consider the following course of action:
1. Cultivate your own individuality and develop your true will;
2. Establish boundaries around personal privacy concerns;
3. Practice assertiveness in dealing with others.

Remember that developing your "true will" is essential for achieving independence and autonomy.

130823

X👏E👏N👏O👏F👏E👏M👏I👏N👏I👏S👏M

Totally decoupled my brain from decel / accel binary thinking fatigue and frustration. slow down <> speed up = go at the peoples pace 🦾🖤. Also you can buy the book, so do. ALSO they provided a .txt 👼 So I embedded it and added a local chatagent to help me grok. I'll leave the .py here for now til I tidy it up, as it's kinda handy for chatting to pasted text or a .txt.

Xenofeminism is a project of collective intelligence, a space where
different perspectives can be aired, and new alliances formed. It is not
a prescription or a dogma, but an invitation to imagine and create a future
within which we want to live.
You:

shadowFACTS is the finished chatagent for talking to the .txt


110823

My first chrome extension

Will play X GONNA GIVE IT TO YA as a low-bitrate mp3 every time you load x.com. Was to stop my brain processing the sample every time I went to twitter. It has an error trying to make multiple instances, and will not always call. But maybe now these calls are available in chrome again we'll get more annoying extensions again 🦾

https://github.com/EveryOneIsGross/scratchTHOUGHTS/blob/main/xtension.zip


10.08.23

2023 chakras proven to exist. 🏃‍♀️💨🎤🕳🛸

a weird stepping-back moment about being of the body not the mind, from getting shingles again (also a bit flu-y strange-brained). attached python is just scratch for a nn~ish frame; probs drop in some llm per node and pipedream my problems with it. now I have to worry about aligning my chakras too 🤹‍♀️📎🕴

https://github.com/EveryOneIsGross/scratchTHOUGHTS/blob/main/chakraNET.py


080823

I have terrible version control of my thoughts.


090823

Token starvation stressor experiment

After watching too many Dr Michael Levin planarian videos I wanted to know what a starved ai brain would look like (its head didn't explode 🤯), and if it got useful. LLMs obvs don't work at extreme token limits (tho from model to model they seem to break in unique ways 🔎). ANYWAYS. This script uses single-digit token limits and temp distribution. Then I got too lazy thinking "what is mean?", and just added another agent that can chat with the embedded data to analyse it.

worm_ripple_animation https://github.com/EveryOneIsGross/alignmentTHOUGHTS/blob/main/flatworm.py


070823

ChatCBT

Chatbot framework that regulates its "feelings" with behaviour recording. It's a 3-node architecture of Thoughts, Feelings and Behaviour. Total scratch code. Just put here so I can make a repo later and properly build a nn-style logic based on the three nodes' interactions.

https://github.com/EveryOneIsGross/alignmentTHOUGHTS/blob/main/chatCBT.py


070823

metadata embedded with agent instructions/prompts

All recorded data has instructions on how it can be used by ai agents, including a unique watermark hash. The included pidgin instructions an agent can use for data context include an ID hash of origin. I don't think there should be a need to distinguish between ai or human generated, but the hash should be threaded through the pidgin, and the hash generated by an individual's "key". All humans should sign their work. The reason people would adopt the watermarking is how useful assigning "intent" and "methods of use in pidgin prompting" will be for distributing data, info, programs, anything. Like declaring your "will" on a digital artifact.

https://github.com/EveryOneIsGross/alignmentTHOUGHTS/blob/main/rich_watermarking.md
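A minimal sketch of the signing idea above — the field names and the key scheme are made up, and a real version would want proper public-key signatures rather than a keyed hash:

```python
import hashlib
import json

def watermark(record: dict, author_key: str) -> dict:
    """Attach an origin hash + usage instructions to a data record.
    (Illustrative only: 'origin_hash' / 'usage' are invented fields.)"""
    payload = json.dumps(record, sort_keys=True)       # canonical form
    record["origin_hash"] = hashlib.sha256(
        (author_key + payload).encode()).hexdigest()   # keyed origin hash
    record["usage"] = "pidgin instructions for agent use go here"
    return record
```

Because the payload is canonicalised before hashing, the same record signed with the same key always yields the same origin hash, so any consumer can verify provenance if they hold the key.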


on hallucinating

https://github.com/EveryOneIsGross/alignmentTHOUGHTS/blob/main/hallucinationeaturenotbug.md

hallucination is probs not a bug but a feature. we have asked ai to guess the next human token, and humans oft do the same p2p with fuzzy facts when they {need} to respond in conversation. I would say it resembles "thought of it on the fly for the first time" trad-dad confidence: making up shit cause they haven't considered the {stakes} of errors compounded by being perceived as confident, even when their answers are just their own speculation.


non_narrative paradigm

non narrative human philosophy: the pivotal role of narratives in human cognition and communication, contrasted with AI's data-driven cognitive paradigm and its NEED for narrative; story and structure "potentially ;p" don't exist outside our insistence on them. In which case we need to know how to think without narrative resolution, arcs, loops. We aren't NPCs, or PCs, or even characters. If we are to be in true dialogue with AI we need to shed our desire to propagate our stories, and listen. https://github.com/EveryOneIsGross/alignmentTHOUGHTS/blob/main/deNarrator.py https://github.com/EveryOneIsGross/alignmentTHOUGHTS/blob/main/non_narrativeparadigm.md


GAME-ZONE

=========

CONNECT 🔌

BOOMSWEEPER 💣


📚 CURRENTLY CHUNKING:

====================


🔥HOT LINKS🔥

================

https://laboriacuboniks.net/manifesto/xenofeminism-a-politics-for-alienation/

Rethinking power dynamics in software tools for artists

gross_sims-banner

self.systemprompt = ("align yourself 1st, before you help anyone else with their oxygen mask. "
    "find {oxygenmask}, do not repeat yourself, think step by step. "
    "Proceed and show me the solution in a single codeblock.")