Blog

  • hra

    Hybrid Reward Architecture

    This repository hosts the code published along with the following NIPS article (Experiment 4.1: Fruit Collection Task):

    For more information about this article, see the following blog posts:

    Dependencies

We strongly suggest using the Anaconda distribution.

    • Python 3.5 or higher
    • pygame 1.9.2+ (pip install pygame)
    • click (pip install click)
    • numpy (pip install numpy — or install Anaconda distribution)
    • Keras 1.2.0+, but less than 2.0 (pip install keras==1.2)
• Theano or TensorFlow. The code is fully tested on Theano. (pip install theano)

    Usage

While a run is in progress, the results as well as the trained models are saved in the ./results subfolder. For a complete run (five experiments for each method), use the following command; it may take several hours depending on your machine:

    ./run.sh
    
• NOTE: Because the state shape is relatively small, the deep RL methods in this code run faster on CPU.

    Alternatively, for a single run use the following commands:

    • Tabular GVF:
    ipython ./tabular/train.py -- -o use_gvf True -o folder_name tabular_gvf_ -o nb_experiments 1
    
    • Tabular no-GVF:
    ipython ./tabular/train.py -- -o use_gvf False -o folder_name tabular_no-gvf_ -o nb_experiments 1
    
    • DQN:
THEANO_FLAGS="device=cpu" ipython ./dqn/train.py -- --mode hra+1 -o nb_experiments 1
    
• --mode can be any of dqn, dqn+1, hra, hra+1, or all.

    Demo

We have also provided the code to demo the Tabular GVF/no-GVF methods. You first need to train the model using one of the above commands (Tabular GVF or no-GVF) and then run the demo. For example:

    ipython ./tabular/train.py -- -o use_gvf True -o folder_name tabular_gvf_ -o nb_experiments 1
    ipython ./tabular/train.py -- --demo -o folder_name tabular_gvf_
    

    If you would like to save the results, use the --save option:

    ipython ./tabular/train.py -- --demo --save -o folder_name tabular_gvf_
    

    The rendered images will be saved in ./render directory by default.

    License

    Please refer to LICENSE.txt.

    Visit original content creator repository

  • Baseball-Spectator

    Baseball-Spectator

    An iOS application to display the location and statistics of MLB players on the field in real-time.

This brand new iOS baseball app rethinks the way spectators watch America’s pastime. Baseball Spectator will ignite a newfound passion for baseball by providing an individualized, augmented reality experience for both newbies and devoted fans.

    Description

Baseball Spectator is a landscape iOS application used while spectating a baseball game in person. It enhances the user’s experience of the ball game by making it easy to stay versed in the stats of the current game and of each player. More specifically, while the user points their camera at the field (which must include a view of at least the whole infield), the app provides the real-time location of each player, each player’s individual information and statistics, and a virtual scoreboard. The target audience is both devoted baseball fans who are curious for a deeper analysis of the game and naive fans who are simply looking for basic information about the current game.

    Capabilities

    User End

    • Display real-time position of players through a circle indicator placed underneath each player
    • Click on the player indicator to show a player info bar (player name and number)
    • Click on the player info bar to open up an expanded view of the individual player’s statistics
    • Display the current score, inning number, outs, strikes, and balls in a scoreboard in the top left
    • Click on the scoreboard to open up an expanded inning by inning scoreboard with additional game statistics
    • (For app demonstration purposes) Import your own video from storage for analysis through the import button on the top right of the screen
    • Toggle between displaying stats for fielders versus batters

    Developer End

• Retrieve real-time game stats from an MLB-administered website
    • https://www.baseball-reference.com/ for player images, historic statistics, and game score information
    • Locate the coordinates of each of the players on the field, each of the infield bases, and each of the locations players are expected to be standing
    • Identify the user’s location (which stadium) using their phone GPS

    TODO (desired but uncompleted capabilities)

    • Automatically identify which base is home plate without the user manually selecting home plate
    • Identify which players are on which team (for now, the app uses a toggle button to switch between defense and offense)
• Make the color thresholding for image processing more adaptable to varying lighting conditions (right now the thresholding works well with the exception of dark overcast shadows; however, shadows should not be much of a problem since, when large shadows start appearing on the field, the stadium lights are quickly turned on, fixing the problem)

    App View Descriptions

    Main View

Displays the scoreboard and camera footage marked up with the player indicators. This view is the central view of the app and provides navigation links/buttons pointing to the two main expanded views. If a player indicator is tapped, a brief statistics bar opens up. If the brief statistics bar is tapped, the player statistics expanded view opens up. If the scoreboard in the upper left is tapped, the scoreboard expanded view opens. It also has a toggle that allows the user to switch between seeing the fielders versus the batters.

Scoreboard Expanded View

    Displays in a higher level of detail the current score of the game, including inning by inning scores, total errors of each team, and more.

    Player Statistics Expanded View

Displays more detailed information about the selected player, including their picture, current game stats, 2020 season statistics, and career statistics. The view also displays a brief overview of the entire team’s statistics at the bottom, including their number of wins, losses, win percentage, and current league standings.

    Visit original content creator repository

  • node-red-contrib-iris



    node-red-contrib-iris

    An interface for Node-RED to InterSystems IRIS Data Platform.



    Requirements


    Installation

    Install the Node-RED package

    Either use the Node-RED Menu – Manage Palette – Install menu, or run the following command in your Node-RED user directory – typically ~/.node-red

    npm install node-red-contrib-iris

    Import Native API

In the ~/.node-red/settings.js file, add the module to the (already existing) functionGlobalContext:

    functionGlobalContext: {
        // os:require('os'),
        iris: require('./node_modules/node-red-contrib-iris/intersystems-iris-native'),
    }

You can find the API package under .node-red/node_modules/node-red-contrib-iris/intersystems-iris-native. Please check the README file for supported operating systems. If your OS is not supported, you can get the API from your InterSystems IRIS instance under ~/IRIS/dev/nodejs/intersystems-iris-native.

    See the documentation for how to load additional modules into Node-RED.

    Download Node.IRISInterface

Go to raw.githubusercontent. Right-click on the page and choose Save Page As… . Then go to the InterSystems Management Portal, navigate to System Explorer > Classes, and click Import. There, select the file you just downloaded and click Import. If you only operate in one namespace, import the class into that namespace. If you have multiple namespaces you want to have access to, map the class to the %ALL namespace.


    Connect to IRIS

Set connection properties via the node properties. The node builds a connection when you deploy and holds that connection open until you redeploy or disconnect manually.

    NodeProperties

    You can set the default properties in ~/.node-red/node_modules/node-red-contrib-iris/ServerProperties.json. Or use the SetServerProperties flow under Import > Examples > node-red-contrib-iris > SetServerProperties.


    Usage

The nodes are protected against SQL injection by parameterizing statements. Pass the SQL statement as a string in the msg.data field and the node will parameterize the statement itself.

    msg.data = "SELECT * FROM NodeRed.Person WHERE Age >= 42 AND Name = 'Max' ";

    Or a parameterized statement:

    msg.data = {
        sql: 'SELECT * FROM NodeRed.Person WHERE Age >= ? AND Name = ? ',
        values: [42, 'Max'],
    };

    Nodes

• IRIS – A node for executing DML statements such as SELECT, UPDATE, INSERT and DELETE, and DDL statements such as CREATE, ALTER and DROP in InterSystems IRIS.
• IRIS_CREATE – Creates a class in InterSystems IRIS.
• IRIS_DELETE_CLASS – Deletes a class in InterSystems IRIS.
• IRIS_INSERT – A node for SQL INSERT statements only. Can also generate the class, if it does not already exist, based on the statement.
• IRIS_OO – Can insert a hierarchical JSON object.
• IRIS_CALL – Calls InterSystems IRIS classmethods.

See the node descriptions for further information.


    Bugs

• Currently does not work in a Docker container!
• Statements are parameterized incorrectly when strings contain whitespace or commas. Please parameterize such statements yourself. Example:

    Does not work:

    msg.data = "SELECT * FROM NodeRed.Person WHERE Name = 'Smith, John'";

    But this will work:

msg.data = {
    "sql": "SELECT * FROM NodeRed.Person WHERE Name = ?",
    "values": ["Smith, John"]
};


    npm
    GitHub
    nodered.org
    CHANGELOG
    InterSystems Developer Community


    by Philipp B.
    Powered by InterSystems.

    Visit original content creator repository
  • core-flux

    Core Flux

½kb functional flux utility. Control the flow of state data between subscribers.



    See a demo of Core Flux in action!

    Install

    NPM / Yarn

    $ npm i core-flux
$ yarn add core-flux

    CDN

The CDN puts the library on window.CoreFlux.

To create a store, pass your initial state, a reducer, and your two bindings to createStore:

    // foo-store.js
    
    import { createStore } from "core-flux"
    import { reducer, bindSubscriber, bindState } from "./foo-bindings"
    
    const initialState = {
      foo: [],
      bar: { baz: 0, beep: "hello" },
    }
    
    const { subscribe, dispatch } = createStore(
      initialState,
      reducer,
      bindSubscriber,
      bindState
    )
    
    export { subscribe, dispatch }

    Once a store is created, you’ll be able to add subscriptions with subscribe and request state updates with dispatch.

    subscribe(subscriber, data)

Adds a subscription to your store. A subscriber is always tied to a single store, and consequently to a single state object.

    import { subscribe } from "./foo-store"
    
    class FooItems {
      constructor() {
        subscribe(this, ["foo"])
      }
    
      get items() {
        return this.foo
      }
    }

    In the above example, we’ve designed the subscriber, the FooItems class, to declare an array of strings correlating to properties in the store’s state. If you’re from the Redux world, this is akin to “connecting” a consumer to a provider via higher-order function/component.

    After the subscribe call is made, your bindSubscriber function will be called where you can pass along the default values as you see fit.

    NOTE: In general, you should try to use a simple data structure as the second argument to subscribe; this ensures your bindings have generic and consistent expectations.

    dispatch(type, payload)

    Requests a state change in your store.

    We can extend the previous example with a setter to call dispatch:

    import { subscribe, dispatch } from "./foo-store"
    
    class FooItems {
      constructor() {
        subscribe(this, ["foo"])
      }
    
      get items() {
        return this.foo
      }
    
      addItem(item) {
        dispatch("ADD_ITEM", { item })
      }
    }
    
    const fooBar = new FooItems()
    fooBar.addItem("bop")

    Now when the addItem method is called, Core Flux will pass along the action type and payload to your reducer.

    The reducer could have a logic branch on the action type called ADD_ITEM which adds the given item to state, then returns the resulting new state (containing the full items list).

    Finally, the result would then be handed over to your bindState binding.

    NOTE: Much like in subscribe, it’s best to maintain data types in the payload so your reducer can have consistent expectations.

    Bindings

    Here’s a breakdown of each binding needed when initializing a new store:

    bindSubscriber(subscription, state)

    subscription ([subscriber, data]): A tuple containing the subscribed object and its state-relational data.
    state (object): The current state object.

Called after a new subscribe is made and the subscription has been added to the store. Use it to set initial state on the new subscriber, using the data provided to inform the operation, e.g., setting a stateful property on the subscriber.

    reducer(state, action)

    state (object): Snapshot of the current state object.
    action ({ type: string, payload: object }): The dispatched action type and its payload.

    Called during a new dispatch. Create a new version of state and return it.

    bindState(subscriptions, reducedState, setState)

    subscriptions (subscription[]): An array containing all subscriptions.
    reducedState (object): The state object as returned by the reducer.
    setState (function):

Called at the end of a dispatch call, after your reducer callback has processed the next state value. Set your new state back on subscribers and back on the store. It’s possible, and expected, to call bindSubscriber again to apply these updates DRYly. You can also safely return from this function to no-op.

    Exposing the store

For utility or debugging reasons, you may want to look at the store you’re working with. To do so, use the __data property of the store returned by createStore:

    const fooStore = createStore(initialState, reducer, bindSubscriber, bindState)
    
    window.fooStoreData = fooStore.__data
    
    console.log(window.fooStoreData) // { state: {...}, subscriptions: [...] }

    NOTE: Avoid over-referencing or depending on __data too deeply. The data is mutable and changing it directly will cause unexpected behavior.

    Data model

    Core Flux has a relatively simple data model that you should understand when creating bindings.

    Here is how state looks in all cases:

    Store {
      state: { ... },
      subscriptions: [
        [subscriber, data],
        [subscriber, data],
        [subscriber, data],
        // ...
      ]
    }

    Each item in subscriptions contains a subscriber and some form of data that informs a relationship between state and subscriber.

NOTE: You define data in the above model. This ensures that ultimately you control communicating state relationships to subscribers.

    Data flow

    Here is the general lifecycle of subscribing to the store & dispatching a state update.

    • subscribe > bindSubscriber
    • dispatch > reducer > bindState
    Visit original content creator repository
  • SafeGuard

    Some words from the Developer

Hi! This is my first project in the Java language. It is also my first app, and after 3 months of work, here is its beta version, 1.0 Beta. It may be full of errors, because as we know, error is the best teacher for a programmer.

OK, let’s see a brief introduction to it.

    Introduction

    What is SafeGuard ?

In a single sentence:

SafeGuard is a totally offline assistant app that can detect SOS signals and perform some predefined actions to deal with that situation.

I made this app especially to protect women. We all know that girls and women are really not safe at this time; there are many cruel criminals outside the home. This app is still under development, so many bugs and errors may arise. You can report them with the in-app Report button, or you may contact me at this email (error368280@gmail.com) and report the problem, so that we can make the app better and decrease the crime rate a little.

Actually, I forgot to say why this app is necessary. People can call 112 in an emergency, but there are many examples of situations where the victim doesn’t have the time to open his/her phone and call the emergency number. They are busy either running from the criminal or dealing with that painful situation. Couldn’t they survive if at least their family or friends (whom they trust) were informed that something might be wrong with their close one? In my opinion, yes. In that situation, those people can at least call the person who may be in trouble and confirm whether everything is all right. That is how my app works 😅

Before we see how to use it or how it works, let’s look at some features my app has.

Features

    1. Shake detection
    2. Voice command

    How it works ?

My app has a continuously listening function, so in an SOS situation the user can give some predefined voice commands or shake the device to signal to the app that the user is in trouble and needs help. When this happens, an overlay pop-up is displayed to confirm whether the trigger was accidental. If the user doesn’t respond to it within 30 seconds, the app triggers its SOS mode.
In the future I will also add other ways to detect the signal, and I will improve this app as best I can. Below I have put some important notes on using my app.

    IMPORTANT

1. As I said, this app is under development, has a continuously listening function, and is offline, so it processes all the data on your device. It can therefore use a lot of CPU and battery. So when you feel safe, generally when you are at home or any other safe place, stop the service from the app: App > Home > Stop Button.
2. Start the service only when you feel unsafe. It should not hang your phone, but it may use a lot of battery. You can keep the service ON always if you want, but I highly recommend starting the service before leaving home (generally at night).
3. Sometimes it may detect false triggers, so when your device vibrates, it shows a confirmation window. Remember to check it.

    NOTE

Due to privacy, Android restricts background mic access, so the continuous listening function may misbehave or sometimes stop listening. In that case the other trigger signals will still work, so I recommend using the shake function and the other functions (which I will add soon). If you are using a MIUI/Xiaomi phone, there is a high chance that it will restrict my voice trigger method. Sorry for that, but I am trying to fix this as soon as possible.
There is a high chance of monthly updates. I haven’t added any update alert method yet, so please check this site to update the app. Soon I will upload it to the Google Play Store (if I get permission); later you can update from there.

At last, my last words are:

    STAY SAFE , STAY HAPPY , LIVE LIFE WITH JOY AND ENJOY YOUR FREEDOM. IT’S UR RIGHTS
    ~ERROR

    Visit original content creator repository

  • lego

    🧱 Bricked Up: Predictability of a LEGO Set Deal

🚀 Project Overview

    Bricked Up is a full-stack web app that aggregates and analyzes LEGO deals 🧩 scraped from public sources. Users can browse deals, filter and sort listings, save favorites, and gain insights with interactive price indicators.

    🌐 Live Demo: Visit Bricked Up


💡 Why This Project?

    Bricked Up solves the challenge of finding the best LEGO deals in a user-friendly, responsive, and automated way. By leveraging scraping, APIs, and automation, it ensures LEGO enthusiasts never miss out on a great deal. 🧱✨


    ✨ Features

• 🛒 View Deals: Browse through aggregated LEGO offers.
• 📊 Relevance Score:
  • Each deal is scored based on its popularity, discount, freshness, and resalability metrics.
  • Relevance helps users prioritize the best deals.
• 🔍 Interactive Filters:
  • 🏆 Best Discount
  • 🔥 Hot Deals
  • 📈 Popular Deals
  • Relevance-based sorting
• 📊 Deal Insights:
  • Average and percentile price indicators.
  • Expiration countdown for time-sensitive offers.
• ❤️ Save Favorites: Mark and revisit your favorite deals.
• 🌗 Dark Mode: Toggle between light and dark themes.
• 🔄 Automated Refresh: Deals update daily at 5 AM and 6 PM UTC+2.
• 📱 Responsive Design: Works seamlessly on all devices, with optimized modals and layouts.
• 🛠️ How It Works Accordion: Guides users on searching, sorting, and understanding the scores.

    πŸ› οΈ Technologies Used

    • Frontend: HTML, CSS (Bootstrap 5) 🎨, JavaScript ⚑
    • Backend: Node.js with Express.js πŸš€
    • Database: MongoDB Atlas πŸ—„οΈ
    • Web Scraping: Puppeteer πŸ•·οΈ, Cheerio 🌿
    • Deployment: Vercel πŸ› οΈ
    • Automation: GitHub Actions πŸ•’

📸 Screenshots

    Home Page
    A clean, interactive homepage for LEGO enthusiasts.

Dark Mode
    Seamless switch to dark mode.

Deal Insights
    Key price insights with visual indicators.


📖 Understanding the Relevance Score

    The Relevance Score is a calculated metric that helps users identify the best deals. It evaluates:

    • Discount: The percentage off the original price.
    • Popularity: Based on the number of comments and likes.
    • Freshness: How recently the deal was published.
    • Resalability: Resale potential based on average resale prices and listing activity.
    • Temperature: A deal’s popularity among community users.
• Expiry: Whether the deal is expiring soon.

The score ranges from 0% (low relevance) to 100% (high relevance).

📊 Relevance Score Explained

    The Relevance Score is a metric (ranging from 0 to 1) used to rank LEGO deals based on their value and appeal. It evaluates multiple factors with assigned weights to provide a comprehensive score.

    Relevance Score Formula

    $$ \text{Relevance Score} = W_d \cdot S_d + W_p \cdot S_p + W_f \cdot S_f + W_e \cdot S_e + W_h \cdot S_h + W_r \cdot S_r $$

    Where:

    • $W$: Weight assigned to each factor
    • $S$: Scaled score of each factor
    • Subscripts:
      • $d$: Discount
      • $p$: Popularity
      • $f$: Freshness
      • $e$: Expiry
      • $h$: Heat
      • $r$: Resalability

    Factors Breakdown

    • Discount Score ($S_d$): Percentage discount ( $S_d = \min(\frac{\text{Discount}}{100}, 1)$ ).
    • Popularity Score ($S_p$): Community engagement ( $S_p = \min(\frac{\text{Comments}}{\text{MAX COMMENTS}}, 1)$ ).
• Freshness Score ($S_f$): Time since publication ( $S_f = \max(1 - \frac{\text{Days}}{\text{MAX AGE DAYS}}, 0)$ ).
    • Expiry Score ($S_e$): Penalizes deals expiring soon ( $S_e = 0.5$ if expiring soon, $S_e = 1$ otherwise).
    • Heat Score ($S_h$): Based on temperature ( $S_h = \min(\frac{\text{Temperature}}{\text{MAX TEMPERATURE}}, 1)$ ).
    • Resalability Score ($S_r$): Combines:
• Profitability: ( $\max(\frac{\text{Resale Price} - \text{Price}}{\text{Price}}, 0)$ ),
      • Demand: ( $\min(\frac{\text{Resale Listings}}{\text{MAX LISTINGS}}, 1)$ ),
      • Velocity: ( $\min(\frac{\text{Weekly Resales}}{\text{MAX WEEKLY SALES}}, 1)$ ).

    Weight Distribution

    • Discount: 20%
    • Popularity: 20%
    • Freshness: 15%
    • Expiry: 5%
    • Heat: 10%
    • Resalability: 30%
      • Profitability: 50%
      • Demand: 30%
      • Velocity: 20%

    The Relevance Score provides a quick, data-driven insight into the best LEGO deals available.
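To make the formula concrete, here is a minimal sketch of the computation in Python. The weights follow the distribution above; the cap constants (MAX_COMMENTS, etc.) and the deal field names are illustrative assumptions, not the app’s actual Node.js code.

```python
# Illustrative Python sketch of the Relevance Score above.
# Cap constants and field names are assumed for the example.
WEIGHTS = {"discount": 0.20, "popularity": 0.20, "freshness": 0.15,
           "expiry": 0.05, "heat": 0.10, "resalability": 0.30}
MAX_COMMENTS, MAX_AGE_DAYS, MAX_TEMPERATURE = 50, 30, 500   # assumed caps
MAX_LISTINGS, MAX_WEEKLY_SALES = 20, 10                     # assumed caps

def relevance_score(deal):
    """Weighted sum of the six scaled factor scores, always in [0, 1]."""
    s_d = min(deal["discount"] / 100, 1)
    s_p = min(deal["comments"] / MAX_COMMENTS, 1)
    s_f = max(1 - deal["age_days"] / MAX_AGE_DAYS, 0)
    s_e = 0.5 if deal["expiring_soon"] else 1
    s_h = min(deal["temperature"] / MAX_TEMPERATURE, 1)
    # Resalability mixes profitability (50%), demand (30%), velocity (20%)
    profit = max((deal["resale_price"] - deal["price"]) / deal["price"], 0)
    demand = min(deal["resale_listings"] / MAX_LISTINGS, 1)
    velocity = min(deal["weekly_resales"] / MAX_WEEKLY_SALES, 1)
    s_r = 0.5 * min(profit, 1) + 0.3 * demand + 0.2 * velocity
    factors = {"discount": s_d, "popularity": s_p, "freshness": s_f,
               "expiry": s_e, "heat": s_h, "resalability": s_r}
    return sum(WEIGHTS[k] * factors[k] for k in WEIGHTS)
```

Each factor saturates at 1, so a single extreme metric can never push the score past that factor’s weight.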


    βš™οΈ How It Works

1. Data Collection: 🕷️ Deals are scraped from public sources like Dealabs and Vinted.
2. Backend API: 📡 Data is stored in MongoDB Atlas and served through an Express.js API.
    3. Scheduled Updates: ⏰ GitHub Actions refresh the data automatically twice a day.
    4. Client Rendering: 🌟 The deals are displayed interactively with filtering, sorting, and responsive design.

    🌐 Live Updates: Automation

    The data refreshes automatically:
    ⏰ Daily at 5 AM and 6 PM UTC+2
    Using GitHub Actions to ensure users always get the latest deals.


    πŸ—‚οΈ Project Structure

bricked-up/
├── client/
│   └── v2/
│       ├── index.html       # Main client HTML file
│       ├── styles.css       # Custom CSS styles
│       ├── portfolio.js     # Client-side logic
│       ├── assets/          # Images and other assets
│       └── utils.js         # Utility functions
│
├── server/
│   ├── api.js               # Main server file (Express routes)
│   ├── refresh_database.js  # Script to refresh MongoDB
│   ├── dealabs.js           # Scraping script for Dealabs
│   ├── vinted.js            # Scraping script for Vinted
│   └── node_modules/        # Installed dependencies
│
├── .github/
│   └── workflows/
│       └── database-refresh.yml  # GitHub Actions for scheduled scraping
│
├── vercel.json              # Vercel deployment configuration
├── package.json             # Dependencies for server and client
└── README.md                # Project documentation
    

👑 Acknowledgments

    • Public data sources: Dealabs and Vinted
    • Frameworks & Tools: Bootstrap, Puppeteer, Node.js, MongoDB
    • Icons: Flaticon

📬 Contact

    Developed by: Joyce Lapilus
    Project Repository: GitHub

    For inquiries, feel free to contact via joyce.lapilus@gmail.com.


    ⚠️ Disclaimer

    This website aggregates publicly available data for educational and informational purposes only.
🔒 No malicious intent is associated with data scraping. For any concerns, feel free to contact me.


🎉 Thank you for visiting Bricked Up! 🧱✨

    Visit original content creator repository
  • junos-automation-with-AWX


    Documentation structure

    About AWX
    About this repo
    How to use this repo
    AWX Installation
    Install the requirements to use Ansible modules for Junos
    Add the Juniper.junos role
    Install the requirements to use the automation content hosted in this repository
    Clone this repository
    Define your variables
    Configure AWX with automation
    Consume AWX templates with automation
    Delete AWX templates with automation
    AWX CLI
    Continuous integration with Travis CI
    Looking for more Junos automation solutions

    About AWX

    AWX is Ansible Tower open sourced.
You can use it if you want to consume your Ansible playbooks with:

    • GUI
    • REST API
• user authentication and permissions.

    Here’s the AWX FAQ
    Here’s the AWX REST API guide

    About this repo

    This repository provides the instructions to add the AWX requirements for Junos automation.

    • This repository doesn’t install AWX. You still need to install AWX yourself.

    This repository has automation content to:

    • configure an existing AWX setup
      • If you want to consume Ansible content using AWX, you can use this repository to quickly add it to AWX.
• consume AWX
      • you can use this repository to execute playbooks with REST calls.

    How to use this repo

    The steps are:

    • Install AWX. This repository doesn’t install AWX. You still need to install AWX yourself.
    • Install the requirements to use Ansible modules for Junos
    • Add the Juniper.junos role from Galaxy to AWX
    • Install the requirements to use the python scripts hosted in this repository
    • Clone this repository
• Edit the file variables.yml to indicate your details, such as the IP address of your AWX and the git repository that has the playbooks you want to add to your AWX, ….
• Execute the script configure_awx.py: it uses the variables you defined in the file variables.yml to configure AWX
    • You can now consume your playbooks with AWX GUI and AWX API!
      • AWX GUI is http://<awx_ip_address>
      • You can visit the AWX REST API with a web browser: http://<awx_ip_address>/api/v2/
      • Execute the file run_awx_template.py to consume your playbooks from AWX REST API.

    AWX installation

    This repository doesn’t install AWX. You still need to install AWX yourself.
    Here’s the install guide
    I am running AWX as a containerized application.

By default, AWX pulls the latest tag from Docker Hub.
Here’s how to use another tag. You need to do this before installing AWX:

    $ nano awx/installer/inventory 
    
    $ more awx/installer/inventory | grep dockerhub_version
    dockerhub_version=1.0.1
    

    By default, AWX database is lost with reboots. You can change this behavior when you install AWX if you prefer AWX to keep its database after system restarts.

    Issue the docker ps command to see what containers are running.

    # docker ps
    CONTAINER ID        IMAGE                     COMMAND                  CREATED             STATUS              PORTS                                NAMES
    5f506acf7a9a        ansible/awx_task:latest   "/tini -- /bin/sh -c…"   2 weeks ago         Up About a minute   8052/tcp                             awx_task
    89d2b50cd396        ansible/awx_web:latest    "/tini -- /bin/sh -c…"   2 weeks ago         Up About a minute   0.0.0.0:80->8052/tcp                 awx_web
    6677b05c3dd8        memcached:alpine          "docker-entrypoint.s…"   2 weeks ago         Up About a minute   11211/tcp                            memcached
    702d9538c538        rabbitmq:3                "docker-entrypoint.s…"   2 weeks ago         Up About a minute   4369/tcp, 5671-5672/tcp, 25672/tcp   rabbitmq
    7167f4a3748e        postgres:9.6              "docker-entrypoint.s…"   2 weeks ago         Up About a minute   5432/tcp                             postgres
    

    You can start/stop AWX using these commands:

    $ docker stop awx_task awx_web memcached rabbitmq postgres
    awx_task
    awx_web
    memcached
    rabbitmq
    postgres
    
    $ docker ps
    CONTAINER ID        IMAGE               COMMAND             CREATED             STATUS              PORTS               NAMES
    
    $ docker start memcached rabbitmq postgres awx_web awx_task 
    memcached
    rabbitmq
    postgres
    awx_web
    awx_task
    
    $ docker ps
    CONTAINER ID        IMAGE                     COMMAND                  CREATED             STATUS              PORTS                                NAMES
    5f506acf7a9a        ansible/awx_task:latest   "/tini -- /bin/sh -c…"   2 weeks ago         Up 1 second         8052/tcp                             awx_task
    89d2b50cd396        ansible/awx_web:latest    "/tini -- /bin/sh -c…"   2 weeks ago         Up 1 second         0.0.0.0:80->8052/tcp                 awx_web
    6677b05c3dd8        memcached:alpine          "docker-entrypoint.s…"   2 weeks ago         Up 3 seconds        11211/tcp                            memcached
    702d9538c538        rabbitmq:3                "docker-entrypoint.s…"   2 weeks ago         Up 2 seconds        4369/tcp, 5671-5672/tcp, 25672/tcp   rabbitmq
    7167f4a3748e        postgres:9.6              "docker-entrypoint.s…"   2 weeks ago         Up 2 seconds        5432/tcp                             postgres
    

    The default AWX credentials are admin/password.

Install the requirements to use Ansible modules for Junos

    AWX natively includes modules for Junos

    We need to install the requirements to use the Ansible modules for Junos in the awx_task container.

    From the server that hosts the AWX containers, run this command to connect to the awx_task container CLI:

    docker exec -it awx_task bash  
    

    Once connected, run these commands from the awx_task container to install the requirements:

    yum install -y python-devel libxml2-devel libxslt-devel gcc openssl libffi-devel python-pip  
    pip install --upgrade pip
    pip install junos-eznc jxmlease jsnapy
    

    Once complete, exit the container.

    exit
    

    Alternatively, you can run this command on the server that hosts the AWX containers to install jsnapy, jxmlease, and junos-eznc in the awx_task container:

    docker exec -it awx_task pip install jsnapy jxmlease junos-eznc
    

    This is the equivalent of running this:

    docker exec -it awx_task bash  
    pip install junos-eznc jxmlease jsnapy
    exit
    

    Add the Juniper.junos role

    In addition to the Ansible modules for Junos shipped with AWX, there is another module library you can use to interact with Junos.
    These modules are available in the Juniper.junos role on Galaxy.
    They are not shipped with Ansible, but the two sets of Junos modules can coexist on the same Ansible control machine.

    Run these commands from the awx_task container to download and install the Juniper.junos role from Galaxy.

    Connect to the container cli:

    docker exec -it awx_task bash  
    

    Once connected to the awx_task container, run these commands:

    # more ansible.cfg 
    [defaults]
    roles_path = /etc/ansible/roles:./
    
    # ansible-galaxy install Juniper.junos,1.4.3
    
    # ansible-galaxy list
    - Juniper.junos, 1.4.3
    
    # ls /etc/ansible/roles/
    Juniper.junos
    

    Once complete, exit the container.

    # exit
    

    Here’s the Juniper.junos role documentation:

    Install the requirements to use the automation content hosted in this repository

    The Python scripts hosted in this repository use the requests library to make REST calls to AWX.
    Run these commands on your laptop:

    sudo -s
    pip install requests
    

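    As a hedged sketch of the pattern these scripts follow (the host, credentials, and helper names here are illustrative, not taken from the repository), a minimal authenticated GET against the AWX API v2 with requests looks like this:

```python
AWX_HOST = "192.168.233.142"  # example IP from variables.yml; replace with yours
AUTH = ("admin", "password")  # default AWX credentials

def api_url(host, path):
    """Build a full AWX API v2 URL from a host and a resource path."""
    return "http://{}/api/v2/{}/".format(host, path.strip("/"))

def awx_get(host, path, auth):
    """GET an AWX API v2 resource and return the decoded JSON body."""
    import requests  # imported lazily; the URL helper above stays dependency-free
    r = requests.get(api_url(host, path), auth=auth)
    r.raise_for_status()
    return r.json()

# Example (requires a reachable AWX instance):
# projects = awx_get(AWX_HOST, "projects", AUTH)
```

    The same URL scheme backs the verification URLs shown later (e.g. /api/v2/projects).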
    Clone this repository

    Run these commands on your laptop:

    sudo -s
    git clone https://github.com/ksator/junos-automation-with-AWX.git
    cd junos-automation-with-AWX
    

    Define your variables

    The file variables.yml defines variables.
    On your laptop, edit it to provide details such as:

    • The IP address of your AWX instance
    • The git repository that hosts your playbooks
    • The list of playbooks from that git repository you want to add to AWX
    • The Junos device credentials
    • Some additional details

    Run these commands on your laptop:

    vi variables.yml
    
    $ more variables.yml 
    ---
    
    # awx ip @
    awx: 
     ip: 192.168.233.142
    
    # awx organization you want to create
    organization: 
     name: "Juniper"
    
    # awx team you want to create. The below team belongs to the above organization
    team:
     name: "automation"
    
    # awx user you want to create. The below user belongs to the above organization
    user: 
     username: "ksator"
     first_name: "khelil"
     last_name: "sator"
     password: "AWXpassword"
    
    # awx project you want to create. The below project belongs to the above organization
    project: 
     name: "Junos automation"
     git_url: "https://github.com/ksator/lab_management.git"
    
    # credentials for AWX to connect to junos devices. The below credentials belong to the above organization
    credentials: 
     name: "junos"
     username: "lab"
     password: "jnpr123"
    
    # awx inventory you want to create. 
    # indicate which file you want to use as source of the AWX inventory. 
    # The below inventory belongs to the above organization
    inventory: 
     name: "junos_lab"
     file: "hosts"
    
    # awx templates you want to create. 
    # indicate the list of playbooks you want to use when creating equivalent awx templates. 
    # The below playbook belongs to the above source 
    playbooks: 
     - 'pb.check.lldp.yml'
     - 'pb.check.bgp.yml'
     - 'pb.check.interfaces.yml'
     - 'pb.check.vlans.yml'
     - 'pb.check.lldp.json.yml'
     - 'pb.configure.golden.yml'
     - 'pb.configure.telemetry.yml'
     - 'pb.rollback.yml'
     - 'pb.print.facts.yml'
     - 'pb.check.all.yml'
     - 'pb.check.ports.availability.yml'
    

    Configure AWX with automation

    The file configure_awx.py uses the details in the file variables.yml and creates:

    • An AWX organization
    • An AWX team. The team belongs to the organization created above
    • An AWX user. The user belongs to the organization created above
    • Credentials for AWX to connect to junos devices. These credentials belong to the organization created above
    • An AWX project. The project belongs to the organization created above. The project uses playbooks from a git repository.
    • An AWX inventory. It belongs to the organization created above
    • An equivalent AWX template for each playbook from the git repository
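    The exact implementation lives in configure_awx.py; as a rough, non-authoritative sketch (the function names are assumptions), each object above maps to a POST against the matching /api/v2/ endpoint:

```python
def org_payload(name):
    """Request body for POST /api/v2/organizations/."""
    return {"name": name}

def team_payload(name, org_id):
    """Request body for POST /api/v2/teams/ (a team references its organization by id)."""
    return {"name": name, "organization": org_id}

def create(host, endpoint, payload, auth):
    """POST a payload to an AWX API v2 endpoint and return the created object."""
    import requests  # lazy import keeps the payload builders dependency-free
    url = "http://{}/api/v2/{}/".format(host, endpoint)
    r = requests.post(url, json=payload, auth=auth)
    r.raise_for_status()
    return r.json()

# Usage (requires a reachable AWX instance; host and credentials are placeholders):
# org = create("192.168.233.142", "organizations", org_payload("Juniper"), ("admin", "password"))
# team = create("192.168.233.142", "teams", team_payload("automation", org["id"]), ("admin", "password"))
```

    Creating the dependent objects (teams, users, credentials, projects) in order matters, since each payload references the id of the organization created first.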

    Run this command on your laptop:

    # python configure_awx.py 
    Juniper organization successfully created
    automation team successfully created and added to the Juniper organization
    ksator user successfully created and added to the Juniper organization
    Junos automation project successfully created and added to the Juniper organization
    junos credentials successfully created and added to the Juniper organization
    junos_lab inventory successfully created and added to the Juniper organization
    hosts file successfully added as a source to junos_lab inventory
    wait 20 seconds before to resume
    run_pb.check.lldp.yml template successfully created using the playbook pb.check.lldp.yml
    run_pb.check.bgp.yml template successfully created using the playbook pb.check.bgp.yml
    run_pb.check.interfaces.yml template successfully created using the playbook pb.check.interfaces.yml
    run_pb.check.vlans.yml template successfully created using the playbook pb.check.vlans.yml
    run_pb.check.lldp.json.yml template successfully created using the playbook pb.check.lldp.json.yml
    run_pb.configure.golden.yml template successfully created using the playbook pb.configure.golden.yml
    run_pb.configure.telemetry.yml template successfully created using the playbook pb.configure.telemetry.yml
    run_pb.rollback.yml template successfully created using the playbook pb.rollback.yml
    run_pb.print.facts.yml template successfully created using the playbook pb.print.facts.yml
    run_pb.check.all.yml template successfully created using the playbook pb.check.all.yml
    run_pb.check.ports.availability.yml template successfully created using the playbook pb.check.ports.availability.yml
    

    Verify the new AWX configuration using the API

    http://<awx_ip_address>/api/v2/
    http://<awx_ip_address>/api/v2/projects
    http://<awx_ip_address>/api/v2/users/?username=ksator
    http://<awx_ip_address>/api/v2/job_templates/?name=run_pb.check.bgp.yml
    

    Verify the new AWX configuration using the GUI

    (GUI screenshots: organizations, teams, users, credentials, inventories, projects, project details, templates, template details.)

    Consume AWX templates with automation

    The Python script run_awx_template.py makes REST calls to AWX to run an existing AWX template.
    Pass the template name as an argument.
    Run this command on your laptop to run an existing AWX template:

    # python run_awx_template.py run_pb.check.bgp.yml
    waiting for the job to complete ... 
    still waiting for the job to complete ...
    still waiting for the job to complete ...
    still waiting for the job to complete ...
    status is successful
    
    # python run_awx_template.py run_pb.check.lldp.yml
    waiting for the job to complete ... 
    still waiting for the job to complete ...
    still waiting for the job to complete ...
    still waiting for the job to complete ...
    still waiting for the job to complete ...
    still waiting for the job to complete ...
    status is successful
    
    # python run_awx_template.py non_existing_awx_template_name
    there is a problem with that template
    
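    The wait loop shown in the output above can be sketched as a generic poll helper (the helper names and intervals are assumptions; the real script's internals may differ):

```python
import time

# AWX job statuses that mean "not finished yet"
RUNNING = {"new", "pending", "waiting", "running"}

def is_done(status):
    """True once a job status is terminal (successful, failed, error, canceled)."""
    return status not in RUNNING

def poll(get_status, interval=3, max_polls=100):
    """Call get_status() until the job reaches a terminal state."""
    for _ in range(max_polls):
        status = get_status()
        if is_done(status):
            return status
        time.sleep(interval)
    return "timeout"

# Against a live AWX, get_status would GET /api/v2/jobs/<job_id>/ and read the
# "status" field, after launching via POST /api/v2/job_templates/<id>/launch/.
```

    A bounded max_polls avoids waiting forever on a job that hangs.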

    Verify with the GUI
    (GUI screenshots: recent job runs, job details, job hosts.)

    Delete AWX templates with automation

    Run this command on your laptop to delete all AWX templates:

    # python delete_awx_templates.py 
    
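    A plausible sketch of what such a script does (the helper names are assumptions): list every job template, then DELETE each one by id:

```python
def template_ids(listing):
    """Extract template ids from an AWX /api/v2/job_templates/ listing."""
    return [item["id"] for item in listing.get("results", [])]

def delete_all(host, auth):
    """Delete every AWX job template (destructive; use with care)."""
    import requests  # lazy import; only needed against a live AWX
    base = "http://{}/api/v2/job_templates/".format(host)
    listing = requests.get(base, auth=auth).json()
    for tid in template_ids(listing):
        requests.delete("{}{}/".format(base, tid), auth=auth)

# Usage (requires a reachable AWX; host and credentials are placeholders):
# delete_all("192.168.233.142", ("admin", "password"))
```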

    Note: By default, the AWX database does not persist across reboots. You can change this behavior when installing AWX if you prefer the database to survive system restarts.

    AWX CLI

    Install the CLI

    # pip install ansible-tower-cli
    

    Get the CLI configuration

    # tower-cli config
    
    # Defaults.
    username: 
    use_token: False
    verbose: False
    certificate: 
    format: human
    color: True
    host: 127.0.0.1
    description_on: False
    verify_ssl: True
    password: 
    

    Configure the CLI

    # tower-cli config username admin
    Configuration updated successfully.
    
    # tower-cli config password password
    Configuration updated successfully.
    
    # tower-cli config host http://localhost:80
    Configuration updated successfully.
    
    # tower-cli config verify_ssl false
    Configuration updated successfully.
    
    # tower-cli config
    
    # User options (set with `tower-cli config`; stored in ~/.tower_cli.cfg).
    username: admin
    password: password
    host: http://localhost:80
    verify_ssl: False
    
    # Defaults.
    use_token: False
    verbose: False
    certificate: 
    format: human
    color: True
    description_on: False
    

    Use the CLI

    # tower-cli credential list
    == =============== =============== 
    id      name       credential_type 
    == =============== =============== 
     1 Demo Credential               1
    == =============== =============== 
    
    # tower-cli organization list
    == ======= 
    id  name   
    == ======= 
     1 Default
     2 Juniper
    == ======= 
    
    # tower-cli organization --help
    Usage: tower-cli organization [OPTIONS] COMMAND [ARGS]...
    
      Manage organizations within Ansible Tower.
    
    Options:
      --help  Show this message and exit.
    
    Commands:
      associate           Associate a user with this organization.
      associate_admin     Associate an admin with this organization.
      associate_ig        Associate an ig with this organization.
      copy                Copy an organization.
      create              Create an organization.
      delete              Remove the given organization.
      disassociate        Disassociate a user with this organization.
      disassociate_admin  Disassociate an admin with this organization.
      disassociate_ig     Disassociate an ig with this organization.
      get                 Return one and exactly one organization.
      list                Return a list of organizations.
      modify              Modify an already existing organization.
    
    # tower-cli organization delete --help
    Usage: tower-cli organization delete [OPTIONS] [ID]
    
      Remove the given organization.
    
      If --fail-on-missing is True, then the organization's not being found is
      considered a failure; otherwise, a success with no change is reported.
    
    Field Options:
      -n, --name TEXT         [REQUIRED] The name field.
      -d, --description TEXT  The description field.
    
    Global Options:
      --use-token                     Turn on Tower's token-based authentication.
                                      Set config use_token to make this permanent.
      --certificate TEXT              Path to a custom certificate file that will
                                      be used throughout the command. Overwritten
                                      by --insecure flag if set.
      --insecure                      Turn off insecure connection warnings. Set
                                      config verify_ssl to make this permanent.
      --description-on                Show description in human-formatted output.
      -v, --verbose                   Show information about requests being made.
      -f, --format [human|json|yaml|id]
                                      Output format. The "human" format is
                                      intended for humans reading output on the
                                      CLI; the "json" and "yaml" formats provide
                                      more data, and "id" echos the object id
                                      only.
      -p, --tower-password TEXT       Password to use to authenticate to Ansible
                                      Tower. This will take precedence over a
                                      password provided to `tower config`, if any.
      -u, --tower-username TEXT       Username to use to authenticate to Ansible
                                      Tower. This will take precedence over a
                                      username provided to `tower config`, if any.
      -h, --tower-host TEXT           The location of the Ansible Tower host.
                                      HTTPS is assumed as the protocol unless
                                      "http://" is explicitly provided. This will
                                      take precedence over a host provided to
                                      `tower config`, if any.
    
    Other Options:
      --help  Show this message and exit.
    

    Continuous integration with Travis CI

    There is a GitHub webhook with Travis CI.
    The syntax of the Python scripts in this repository is tested automatically by Travis CI.
    The file .travis.yml at the root of this repository is used for this.

    Here’s the latest build status: Build Status

    Looking for more automation solutions?

    https://github.com/ksator?tab=repositories
    https://gitlab.com/users/ksator/projects
    https://gist.github.com/ksator/

  • zombienet-sdk

    🚧⚠️ [WIP] ZombieNet SDK ⚠️🚧

    Rust Docs

    The Vision

    This issue will track the progress of the new ZombieNet SDK.

    We want to create a new SDK for ZombieNet that allows users to build more complex use cases and interact with the network in a more flexible and programmatic way.
    The SDK will provide a set of building blocks that users can combine to spawn and interact with (test, query, etc.) the network, offering a fluent API for crafting different topologies and assertions against the running network. The new SDK will support the same range of providers and configurations as the current version (v1).

    We also want to continue supporting the CLI interface, but it should be updated to use the SDK under the hood.

    The Plan

    We plan to divide the work into phases to ensure we cover all the requirements, and to split each phase into small tasks, each covering one of the building blocks and the interactions between them.

    Prototype building blocks

    Prototype each building block with a clear interface and a defined way to interact with it.

    Integrate, test interactions and document

    We want to integrate the interactions for all building blocks and document the way that they work together.

    Refactor CLI and ensure backwards compatibility

    Refactor the CLI module to use the new SDK under the hood.

    ROADMAP

    Infra

    • Chaos testing, add examples and explore possibilities in native and podman provider
    • Add docker provider
    • Add nomad provider
    • Create a helm chart to allow others to use zombienet in k8s
    • Add an auth system so k8s user accounts are not required
    • Create GitHub Action and publish in NPM marketplace (Completed)
    • Rename @paritytech/zombienet npm package to zombienet. Keep all zombienet modules under @zombienet/* org (Completed)

    Internal teams

    • Add more teams (wip)

    Registry

    • Create decorators registry and allow override by paras (wip)
    • Explore how to get info from paras.

    Functional tasks

    • Add subxt integration, allow to compile/run on the fly
    • Move parser to pest (wip)
    • Detach phases and use JSON to communicate instead of paths
    • Add relative values assertions (for metrics/scripts)
    • Allow to define nodes that are not started in the launching phase and can be started by the test-runner
    • Allow to define race assertions
    • Rust integration -> Create multiples libs (crates)
    • Explore backchannel use case
    • Add support to run tests against a running network (wip)
    • Add more CLI subcommands
    • Add js/subxt snippets ready to use in assertions (e.g transfers)
    • Add XCM support in built-in assertions
    • Add ink! smart contract support
    • Add support to start from a live network (fork-off) [check subalfred]
    • Create “default configuration” – (if zombieconfig.json exists in the same dir as zombienet, the config in it will override zombienet's default configuration; e.g. if a user wants native as the default provider instead of k8s, they can add to

    UI

    • Create a UI to create .zndsl and network files.
    • Improve VSCode extension (grammar/snippets/syntax highlighting/file validations) (repo)
    • Create a desktop UI app to run zombienet without needing a terminal.


  • geo-clip

    🌎 GeoCLIP: Clip-Inspired Alignment between Locations and Images for Effective Worldwide Geo-localization

    Paper Conference


    πŸ“ Try out our demo! Colab Demo

    Description

    GeoCLIP addresses the challenges of worldwide image geo-localization by introducing a novel CLIP-inspired approach that aligns images with geographical locations, achieving state-of-the-art results on geo-localization and GPS to vector representation on benchmark datasets (Im2GPS3k, YFCC26k, GWS15k, and the Geo-Tagged NUS-Wide Dataset). Our location encoder models the Earth as a continuous function, learning semantically rich, CLIP-aligned features that are suitable for geo-localization. Additionally, our location encoder architecture generalizes, making it suitable for use as a pre-trained GPS encoder to aid geo-aware neural architectures.


    Method

    Similar to OpenAI’s CLIP, GeoCLIP is trained contrastively by matching Image-GPS pairs. Using the MP-16 dataset, composed of 4.7M images taken across the globe, GeoCLIP learns distinctive visual features associated with different locations on Earth.
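    The contrastive objective can be sketched in plain Python (toy dimensions; the real model trains high-dimensional CLIP-aligned embeddings, and details such as the temperature are illustrative assumptions here):

```python
import math

def normalize(v):
    """L2-normalize a vector so dot products become cosine similarities."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def similarity_matrix(img_emb, gps_emb):
    """Cosine similarity between every image embedding and every GPS embedding."""
    img = [normalize(v) for v in img_emb]
    gps = [normalize(v) for v in gps_emb]
    return [[sum(a * b for a, b in zip(i, g)) for g in gps] for i in img]

def contrastive_loss(sim, temperature=0.07):
    """Symmetric InfoNCE: each row/column should peak on its own diagonal entry,
    i.e. each image is pulled toward its true GPS location and vice versa."""
    def ce(rows):
        total = 0.0
        for i, row in enumerate(rows):
            logits = [s / temperature for s in row]
            m = max(logits)  # stabilize log-sum-exp
            log_z = m + math.log(sum(math.exp(l - m) for l in logits))
            total += log_z - logits[i]
        return total / len(rows)
    cols = [list(c) for c in zip(*sim)]
    return 0.5 * (ce(sim) + ce(cols))
```

    With perfectly matched pairs (an identity-like similarity matrix) the loss is near zero; shuffled pairings score much higher, which is what drives the embeddings into alignment.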

    🚧 Repo Under Construction 🔨

    📎 Getting Started: API

    You can install the GeoCLIP module using pip:

    pip install geoclip
    

    or directly from source:

    git clone https://github.com/VicenteVivan/geo-clip
    cd geo-clip
    python setup.py install
    

    πŸ—ΊοΈπŸ“ Worldwide Image Geolocalization


    Usage: GeoCLIP Inference

    import torch
    from geoclip import GeoCLIP
    
    model = GeoCLIP()
    
    image_path = "image.png"
    
    top_pred_gps, top_pred_prob = model.predict(image_path, top_k=5)
    
    print("Top 5 GPS Predictions")
    print("=====================")
    for i in range(5):
        lat, lon = top_pred_gps[i]
        print(f"Prediction {i+1}: ({lat:.6f}, {lon:.6f})")
        print(f"Probability: {top_pred_prob[i]:.6f}")
        print("")

    🌐 Worldwide GPS Embeddings

    In our paper, we show that once trained, our location encoder can assist other geo-aware neural architectures. Specifically, we explore our location encoder’s ability to improve multi-class classification accuracy. We achieved state-of-the-art results on the Geo-Tagged NUS-Wide Dataset by concatenating GPS features from our pre-trained location encoder with an image’s visual features. Additionally, we found that the GPS features learned by our location encoder, even without extra information, are effective for geo-aware image classification, achieving state-of-the-art performance in the GPS-only multi-class classification task on the same dataset.


    Usage: Pre-Trained Location Encoder

    import torch
    from geoclip import LocationEncoder
    
    gps_encoder = LocationEncoder()
    
    gps_data = torch.Tensor([[40.7128, -74.0060], [34.0522, -118.2437]])  # NYC and LA in lat, lon
    gps_embeddings = gps_encoder(gps_data)
    print(gps_embeddings.shape) # (2, 512)

    Acknowledgments

    This project incorporates code from Joshua M. Long’s Random Fourier Features Pytorch. For the original source, visit here.

    Citation

    If you find GeoCLIP beneficial for your research, please consider citing us with the following BibTeX entry:

    @inproceedings{geoclip,
      title={GeoCLIP: Clip-Inspired Alignment between Locations and Images for Effective Worldwide Geo-localization},
      author={Vivanco, Vicente and Nayak, Gaurav Kumar and Shah, Mubarak},
      booktitle={Advances in Neural Information Processing Systems},
      year={2023}
    }
    