Added double-click on nodes, which opens a new tab with either the tweet/user page or the URL carried by that node
Worked on popups for the graph explorer
Added tooltips when hovering over nodes
Shut down the streaming API on DigitalOcean
Writing down ideas for the landing page
Designed a logo for Bipdash
I consider it a first version that is OK to get me going. I don't have the skills to make a beautiful logo, but that can be improved later on.
Assembled the components for the graph explorer (a sketch follows the list):
- GraphQL API
- Elasticsearch for autocompletion
- React Material UI
- vis.js for the network graph
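As a rough sketch of how the front-end piece can hang together (assuming vis-network with plain React; the node shape and its `url` field are my own placeholders, not Bipdash's actual schema), here is a minimal graph component with the hover tooltips and double-click-to-open behavior described above:

```tsx
// Minimal graph view sketch, assuming vis-network; data shapes are placeholders.
import { useEffect, useRef } from "react";
import { Network } from "vis-network";

type GraphNode = { id: string; label: string; title?: string; url?: string };
type GraphEdge = { from: string; to: string };

function GraphExplorer({ nodes, edges }: { nodes: GraphNode[]; edges: GraphEdge[] }) {
  const container = useRef<HTMLDivElement>(null);

  useEffect(() => {
    if (!container.current) return;
    // A node's "title" renders as a tooltip when hovering over it.
    const network = new Network(container.current, { nodes, edges }, {});
    // Double-click opens the tweet/user page or URL carried by the node.
    network.on("doubleClick", (params) => {
      const node = nodes.find((n) => n.id === params.nodes[0]);
      if (node?.url) window.open(node.url, "_blank");
    });
    return () => network.destroy();
  }, [nodes, edges]);

  return <div ref={container} style={{ height: 600 }} />;
}
```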
Set up React Material UI as the framework for Bipdash
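For context, the root-level wiring this implies is small; a hypothetical sketch (theme values and the `App` component are placeholders):

```tsx
// Hypothetical MUI root setup; theme values and App are placeholders.
import { createTheme, ThemeProvider, CssBaseline } from "@mui/material";
import { createRoot } from "react-dom/client";
import App from "./App";

const theme = createTheme({ palette: { mode: "light" } });

createRoot(document.getElementById("root")!).render(
  <ThemeProvider theme={theme}>
    <CssBaseline /> {/* resets browser default styles */}
    <App />
  </ThemeProvider>
);
```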
Got a first version of the graph explorer for Bipdash
I still have tons of things to do to make the graph easy to traverse, comprehend, query, and pull insights from. Very excited to see things coming together :)
Cleaned up wrong URL IDs in Twitter URLs from tweets and user profiles
I naively thought an expanded URL would have a unique shortened URL; of course that is not the case.
So there were duplicate URL objects in the db to clean up. Now sanity is back in Bipdash's db.
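The cleanup boils down to deduplicating URL documents by their expanded form. A minimal sketch, assuming a MongoDB `urls` collection with an `expanded_url` field (both names are my assumptions):

```typescript
// Hypothetical dedup pass: collapse URL documents sharing the same expanded URL.
import { MongoClient } from "mongodb";

async function dedupeUrls(uri: string) {
  const client = await MongoClient.connect(uri);
  const urls = client.db("bipdash").collection("urls");

  // Group by expanded URL and collect the _ids of each duplicate set.
  const dupes = await urls
    .aggregate([
      { $group: { _id: "$expanded_url", ids: { $push: "$_id" }, n: { $sum: 1 } } },
      { $match: { n: { $gt: 1 } } },
    ])
    .toArray();

  for (const d of dupes) {
    const [, ...extras] = d.ids; // keep the first document, drop the rest
    await urls.deleteMany({ _id: { $in: extras } });
  }
  await client.close();
}
```

A unique index on `expanded_url` afterwards would keep the duplicates from coming back.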
Bought the domain bipdash.com
Created and loaded an Elasticsearch index for the future UI autocompletion search
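A minimal sketch of what such an index can look like with Elasticsearch's completion suggester (assuming the v8 JavaScript client; index and field names are my guesses, not the real schema):

```typescript
// Hypothetical autocompletion index + query; names are guesses.
import { Client } from "@elastic/elasticsearch";

const es = new Client({ node: "http://localhost:9200" });

// Index with a completion-suggester field for type-ahead search.
await es.indices.create({
  index: "bipdash-autocomplete",
  mappings: {
    properties: {
      suggest: { type: "completion" },
      kind: { type: "keyword" }, // e.g. "user", "hashtag", "url"
    },
  },
});

// Query as the user types a prefix.
const res = await es.search({
  index: "bipdash-autocomplete",
  suggest: {
    entities: { prefix: "build", completion: { field: "suggest", size: 10 } },
  },
});
```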
Modified the ETL script to create links from hashtags to tweets
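Conceptually these links fall straight out of the hashtag entities Twitter attaches to each tweet; a hypothetical sketch (the edge shape is a placeholder, not Bipdash's schema):

```typescript
// Hypothetical link derivation; the edge shape is a placeholder.
interface Tweet {
  id: string;
  entities?: { hashtags?: { tag: string }[] };
}

// One "TAGS" edge per hashtag occurring in the tweet.
function hashtagLinks(tweet: Tweet) {
  return (tweet.entities?.hashtags ?? []).map((h) => ({
    from: `hashtag:${h.tag.toLowerCase()}`,
    to: `tweet:${tweet.id}`,
    type: "TAGS" as const,
  }));
}
```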
Deployed and scheduled the ETL script for Bipdash
Completed the ETL script (daily run) to load the db
Built and deployed the API
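Since the graph explorer's component list above names a GraphQL API, here is a hedged sketch of a minimal search endpoint (graphql-yoga is an assumed choice, and the schema is a placeholder):

```typescript
// Hypothetical GraphQL search endpoint; graphql-yoga and the schema are assumptions.
import { createServer } from "node:http";
import { createYoga, createSchema } from "graphql-yoga";

const schema = createSchema({
  typeDefs: /* GraphQL */ `
    type Node {
      id: ID!
      label: String!
      kind: String!
    }
    type Query {
      search(term: String!): [Node!]!
    }
  `,
  resolvers: {
    Query: {
      // A real resolver would hit MongoDB/Elasticsearch; this stub just echoes.
      search: (_parent: unknown, { term }: { term: string }) => [
        { id: "1", label: term, kind: "hashtag" },
      ],
    },
  },
});

createServer(createYoga({ schema })).listen(4000);
```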
Finished my ETL script loading Twitter data into MongoDB
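The load step amounts to upserting fetched tweets keyed by their id, so the daily run stays idempotent; a minimal sketch (db and collection names are placeholders):

```typescript
// Hypothetical load step; db/collection names are placeholders.
import { MongoClient } from "mongodb";

async function loadTweets(uri: string, tweets: { id: string }[]) {
  const client = await MongoClient.connect(uri);
  const col = client.db("bipdash").collection("tweets");
  // Upsert by tweet id: re-running the daily job won't duplicate documents.
  for (const t of tweets) {
    await col.updateOne({ id: t.id }, { $set: t }, { upsert: true });
  }
  await client.close();
}
```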
Finished the data model and identified attributes for Bipdash
More info here https://www.notion.so/Bipdash-60d0f6a8f0df472e937a5608ca22be46
Created a Notion account and started my first Notion page to track all my project tasks
And it already feels much better than using harder-to-maintain Google Docs!
Created a high-level data model for Bipdash
For building the graph data model, I used https://arrows.app/. A browser-based graph drawing tool, quite handy!
The next task is to populate the attributes I want to keep for nodes and relationships.
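To make that concrete, a hypothetical sketch of node/relationship shapes (every name here is a guess, not the final model):

```typescript
// Hypothetical node/relationship shapes; all names are guesses.
type NodeKind = "user" | "tweet" | "hashtag" | "url" | "media";

interface GraphNode {
  id: string;
  kind: NodeKind;
  label: string; // display text: @handle, #hashtag, tweet excerpt…
  url?: string;  // page to open when the node is double-clicked
}

interface GraphRelationship {
  from: string; // source node id
  to: string;   // target node id
  type: "POSTED" | "MENTIONS" | "TAGS" | "LINKS_TO";
}
```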
Identified data fields + endpoints needed from Twitter API
I spent quite some time in the past 2 days figuring out how to get what I want from Twitter API v2.
Now I know how to get recent tweets (7 days old max) with the related user and media information that will be needed for the knowledge graph.
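For reference, a minimal sketch of that kind of v2 recent-search call (the query and field lists are illustrative, not necessarily the exact ones Bipdash uses):

```typescript
// Hypothetical v2 recent search; query and field lists are illustrative.
const params = new URLSearchParams({
  query: "#buildinpublic -is:retweet",
  "tweet.fields": "created_at,entities,public_metrics",
  expansions: "author_id,attachments.media_keys", // pulls related users and media
  "user.fields": "username,description",
  "media.fields": "url,type",
  max_results: "100",
});

const res = await fetch(`https://api.twitter.com/2/tweets/search/recent?${params}`, {
  headers: { Authorization: `Bearer ${process.env.TWITTER_BEARER_TOKEN}` },
});
const page = await res.json(); // tweets in .data; expanded users/media in .includes
```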
Next task: I will design the data model
I worked on a macro plan for Bipdash
Objective
----------
Make #buildinpublic information searchable and easy to retrieve insights from. Newcomers should quickly access the highest-quality resources to help them start building in public. Users already building in public could keep sharpening their knowledge of it.
What
------
Create, and update on a regular basis, a knowledge base of #buildinpublic information shared on Twitter.
Build a UI with search functionality that displays the results as a graph, showing relationships between makers, products, conversations between makers, etc.
The graph will be interactive: clicking on nodes will expand them and allow opening tweets, accessing media URLs, etc.
Next Steps
--------------
Build data model from Twitter endpoints analysis
Build & deploy ETL script updating database on a daily basis
Build & deploy API service (search endpoint)
Build & deploy UI
I am starting my first product...
The product is Bipdash (https://getmakerlog.com/products/bipdash).
And the first task was to get access to the Twitter Developer API. I am actually pleased by the process: it is quite easy, and you get tokens in about 5 minutes.
I had to set up Twitter API access for a previous job in 2018, and back then it took me more than a month to get the same developer-level access.
So well done Twitter devs!
Next post should be about how I plan to build the app.
Jonathan Barone (Author)
@uluhonolulu I'll update the project. I find tremendous value in what people share while building in public. Bipdash would be a database consolidating all resources (people, products, tweets, media links…) in the shape of a knowledge graph. A user interface would allow users to parse this information quickly and see connections between products and makers, the various relationships between the resources themselves, etc. Collecting the data, transforming it, and loading it on a regular basis shouldn't be a problem, but I have to figure out where to host the data without paying a significant amount of money.