#robmartinson.me# #blog
For several years I’ve had a loose idea in my head about a two-way content aggregator that would act as a sort of blog or feed. I tend to create content in a variety of different places without a solid focus in any single area, and I would like a way to capture all of that and make it useful. For example, I may post intermittently on the @themartinsonshop instagram page, the @robmartinson twitter/X account, or the @rob.martinson threads account. I also make a ton of notes and brain dump outlines in Bear.app and sort of use that as my central information collection spot. I create meeting notes in Bear and also in a variety of different Google accounts. Some of these outlets are business related, lots are personal hobby-style things, and many are just vents or rants in my head that may sometimes be useful to others.
Anyway, the idea is to create an online application: a feed aggregator that works in both directions, collecting and distributing content to the different places while also keeping a central record of it all and allowing readers (probably mostly me) to consume it and filter it by its source and/or its destination or subject matter. What if I could post content on any one of the services or tools above (twitter, instagram, threads, slack, whatever) and a record of that made it back to my personal website? Likewise, I could post my mini brain dumps, ideas, and rants in Bear.app and, by applying some known tags, have them collected on the personal website and made available to others, and maybe posted to one of the other spots too. I think this is sort of like what Pixelpipe used to do, but with a receiving aggregator as well.
The primary storage point for most of my personal content, notes, ideas, etc. is Bear.app so I should start there.
Goals
- Learn and experiment with different languages and technologies
- Write / develop content
- Create a reusable useful tool
- Push content to all of the places
- Pull content from all of the places
Misc
- Golang backend
- Blog backed by dynamo? Sqlite?
- Evernote api
- Thinkorswim api
- Bear api
- Insta feed
- What I’m listening to
Infra
- cheapest / fastest / simple and learn new stuff
- Fastapi + Jinja + sqlite
- Golang + html templates + sqlite
Deployment
- Raspberry pi (4 or 5) ?
- Hosted at the martinson shop? Limelyte?
- Replicated and/or load balanced?
- VPS?
- SQLite + Litestream?
- How to make it as cheap (free?) and easy as possible
Activity aggregator
Micro blogging / activity aggregating platform. Log events of a certain type in time with tags.
Brain Dump!
- Outbound: If I post content on the site, it pushes that content to different places based on the tags it contains. #threads #martinson.shop etc
- Maybe use bear as the creation point and create an integration to push content. Can it host inbound content too? How to deal with 2 way sync?
- Outbound Channels: some of these can be groups (like robmartinson) that flow to multiple destinations. You can mix and match, for example robmartinson (all robmartinson channels) and threads/themartinsonshop (only the martinson shop threads account). Inbound messages would be tagged with specific source locations. See the routing sketch after this list.
    - robmartinson - group, goes to threads, Facebook, other?
    - threads - goes to rob.martinson
    - martinson.shop - goes to website? Goes to insta/threads? (This is like a group tag)
    - instagram - goes to all accounts
        - instagram/themartinsonshop
        - instagram/rob.martinson
    - Facebook - goes to all accounts
    - X / Twitter
    - Youtube
- Inbound: If I post content somewhere else (fb, threads, whatever), that content is brought into the site and tagged according to source.
- If an inbound content entry has relevant tags, should it be posted back to other outbound channels? In other words, if I post a tweet on robmartinson that has the #threads/robmartinson tag, should that be cross-posted to the threads account? I think maybe so. Interesting!
- Website:
    - fastapi, lambda, jinja, tailwind?
    - golang, html/template, bootstrap?
    - Can content be stored in static formats like markdown?
- Maintaining state: need to make sure we don’t push outbound multiple times. Each entry needs a hash, plus some backend to track whether it’s been processed or not. See the dedupe sketch after this list.
- First, let’s just work on inbound micro blog posts
- nav bar
    - Home
    - Feed
        - All entries
        - filter by tag or tags
            - has tag(s) and/or doesn’t have tag(s)
    - About?
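To make the outbound routing idea above a little more concrete, here is a rough Go sketch of tag-to-channel fan-out where group tags expand to several destinations. Every name in the map is just a placeholder for whatever the real channel list ends up being.

```go
package main

import "fmt"

// Channel is a single outbound destination, e.g. "threads/rob.martinson".
type Channel string

// routes maps a tag (as written on a post) to the channels it should fan out to.
// Group tags like "robmartinson" expand to several channels; specific tags like
// "instagram/themartinsonshop" map to exactly one. These mappings are examples only.
var routes = map[string][]Channel{
	"robmartinson":               {"threads/rob.martinson", "facebook/robmartinson"},
	"martinson.shop":             {"website", "instagram/themartinsonshop", "threads/themartinsonshop"},
	"instagram":                  {"instagram/themartinsonshop", "instagram/rob.martinson"},
	"instagram/themartinsonshop": {"instagram/themartinsonshop"},
	"threads":                    {"threads/rob.martinson"},
}

// expand resolves a post's tags into a de-duplicated set of outbound channels.
func expand(tags []string) []Channel {
	seen := map[Channel]bool{}
	var out []Channel
	for _, t := range tags {
		for _, ch := range routes[t] {
			if !seen[ch] {
				seen[ch] = true
				out = append(out, ch)
			}
		}
	}
	return out
}

func main() {
	// A post tagged with a group tag and a specific channel tag.
	fmt.Println(expand([]string{"robmartinson", "instagram/themartinsonshop"}))
}
```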
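And for the maintaining-state note: one simple way to avoid pushing the same entry outbound twice is to hash the entry plus its destination and record the hash in SQLite once it has been delivered. This is only a sketch; the `pushed` table and its columns are made up.

```go
package dedupe

import (
	"crypto/sha256"
	"database/sql"
	"encoding/hex"
)

// Assumed table, created elsewhere with whatever SQLite driver the app uses:
//   CREATE TABLE IF NOT EXISTS pushed (hash TEXT PRIMARY KEY, pushed_at TEXT);

// entryHash identifies one (content, destination) pair so the same post can
// still go to different channels, but never twice to the same one.
func entryHash(content, destination string) string {
	sum := sha256.Sum256([]byte(content + "|" + destination))
	return hex.EncodeToString(sum[:])
}

// markPushed records a delivery; the primary key makes retries harmless.
func markPushed(db *sql.DB, hash string) error {
	_, err := db.Exec(`INSERT OR IGNORE INTO pushed (hash) VALUES (?)`, hash)
	return err
}

// alreadyPushed reports whether this entry/destination pair was delivered before.
func alreadyPushed(db *sql.DB, hash string) (bool, error) {
	var n int
	err := db.QueryRow(`SELECT COUNT(*) FROM pushed WHERE hash = ?`, hash).Scan(&n)
	return n > 0, err
}
```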
Event Types / Connectors
- Google Docs (push from bear to google docs)
- Bear (push from bear to website, article, docs, whatever)
- Email
    - Link w Gmail or others; auto index and tag by to, from, cc, and subject. This replaces Copper.
    - Receive a forwarded email via an inbox email address to add to timeline
- Photo
    - Link w iPhoto or google stream?
- Blog entry
    - Link w WordPress or support micro blog format? This is covered by the Bear integration
- X/Twitter?
- Insta?
- Web event / log?
- Fb post?
- Audible events? What book(s) am I reading, etc.
- GPS Location / checkin
- Telegram?
- Ability to view any tag or combination of tags or all events in an interactive timeline format. This will be extremely useful to see the relationship between different events in time.
- Events can have an optional length in seconds. If an event has a length of zero, it is just a marker for a point in time; if it has a length longer than zero, it lasted that long from inception to end.
- Ability to print any timeline view in different formats. When printing, one view might be the timeline with numbered events (1, 2, 3), with full event details printed one per sheet at the end so it’s easy to link event details with the timeline. Also a pretty-print option to include photos, etc.
- Ability to manually create an event of any type
- Ability to link entries to other entries to create context
- Ability to follow a tag or set of tags
- Ability to make a tag or set of tags public or private. Explicit private overrides public when both are on the same event
- Ability to edit tags later
- Ability to view tag/filter subscribers
- Ability to publish a tag/filter list to another website (as a blog or feed w rss)
- Useful for a tech log, timeline of a build (house, app, other), timeline of client interaction (case building, Ernesto), storytelling, blogging, writing a book, collecting research, traveling, journaling
- Activity table (see the schema sketch after this list)
    - Id
    - Timestamp
    - Type - blog, url, Twitter, instagram, audible, podcast, sonos, YouTube publish, YouTube watch, etc.
    - Tags - used to categorize feed w type
    - Content - JSON doc with all relevant data for entry
- Publish as RSS?
- Webhooks to receive data or initiate a request to get data when another system changes
- Comment system? Sign in with GitHub, Google, X/Twitter, Insta, whatever
- Ability to subscribe to a filter
- Full REST endpoint to manage content
- Urls (see the handler sketch after this list):
    - /feed - all available here
    - /feed/{type} - all filtered by type
    - /feed/{type}/{slug} - specific entry of a type, by friendly url
    - /feed/{guid} - specific entry
    - ?tags=[one,two,three] - filtered by tags, contains all
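Here is a guess at what the activity table could look like as SQLite DDL embedded in Go, since the Golang + sqlite option would need something like this. The columns follow the bullets above; the table name, the JSON tags column, and the indexes are assumptions.

```go
package storage

import "database/sql"

// schema mirrors the Activity table sketched above: id, timestamp, type, tags,
// and a JSON content blob with everything the entry needs to render.
// Storing tags as a JSON array keeps the table simple; a separate tags table
// would be the alternative if tag queries get heavy.
const schema = `
CREATE TABLE IF NOT EXISTS activity (
    id        TEXT PRIMARY KEY,           -- guid for the entry
    timestamp TEXT NOT NULL,              -- RFC 3339 time of the event
    type      TEXT NOT NULL,              -- blog, url, twitter, instagram, audible, ...
    tags      TEXT NOT NULL DEFAULT '[]', -- JSON array of tag strings
    content   TEXT NOT NULL               -- JSON doc with all relevant data for the entry
);
CREATE INDEX IF NOT EXISTS idx_activity_type ON activity(type);
CREATE INDEX IF NOT EXISTS idx_activity_timestamp ON activity(timestamp);
`

// Init creates the table and indexes if they do not exist yet.
func Init(db *sql.DB) error {
	_, err := db.Exec(schema)
	return err
}
```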
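And a handler sketch for the URL scheme using Go 1.22-style net/http patterns, plus a POST endpoint that inbound webhooks (or an export script) could hit. Handler bodies are stubs, and note that /feed/{type} and /feed/{guid} look identical to a router, so a single handler has to tell them apart.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()

	// Full feed; ?tags=one,two,three filtering would be read from the query string.
	mux.HandleFunc("GET /feed", func(w http.ResponseWriter, r *http.Request) {
		tags := r.URL.Query().Get("tags") // e.g. "one,two,three"
		fmt.Fprintf(w, "all entries, tags filter: %q\n", tags)
	})

	// /feed/{type} and /feed/{guid} share one pattern; the handler decides whether
	// the segment is a known type or a guid, since the two URL forms are ambiguous.
	mux.HandleFunc("GET /feed/{key}", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintf(w, "entries for type, or single entry by guid: %s\n", r.PathValue("key"))
	})

	// Specific entry of a type by friendly slug.
	mux.HandleFunc("GET /feed/{type}/{slug}", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintf(w, "entry %s of type %s\n", r.PathValue("slug"), r.PathValue("type"))
	})

	// Inbound webhook / REST create: other services or scripts POST new entries here.
	mux.HandleFunc("POST /feed", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusCreated)
	})

	log.Fatal(http.ListenAndServe(":8080", mux))
}
```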
Subscriptions / Permissions
- Each type and tag can be public or private by default. If public, anyone can see or subscribe to it. If private, subscribers must be approved or invited.
- Once a consuming / following user finds a good set of filters, they can subscribe to that filter (rss?)
Integrations
- Bear (see the export/push sketch after this list)
    - Search for the blog tag and published tag and send those notes to posting
    - Export to md and push
    - How to deal w images/assets? Push to S3
- Activity stream micro blog with manual posts or events from different services like Audible, Sonos, Instagram, Facebook, Twitter, Reddit, WordPress, etc.
- types (see the struct sketch after this list)
    - blog entry
        - id
        - date/time
        - status (published or not)
        - title
        - content
        - tags
    - url
        - id
        - timestamp
        - url - of original
        - content (cache all items from URL once posted)
- Add comments
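Bear doesn’t have a server-side API, so one low-tech way to do the export-and-push step is a small tool that walks a folder of Markdown exports, keeps only notes carrying both the #blog and #published tags, and POSTs them to the site’s inbound endpoint. Everything here (folder, tags, endpoint) is an assumption, and dedupe and asset handling are left out.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"os"
	"path/filepath"
	"strings"
)

// hasTags reports whether a note's Markdown body contains every required Bear tag.
func hasTags(body string, required ...string) bool {
	for _, t := range required {
		if !strings.Contains(body, t) {
			return false
		}
	}
	return true
}

func main() {
	exportDir := "./bear-export"             // folder of .md files exported from Bear (assumed)
	endpoint := "http://localhost:8080/feed" // the site's inbound POST endpoint (assumed)

	entries, err := os.ReadDir(exportDir)
	if err != nil {
		panic(err)
	}
	for _, e := range entries {
		if e.IsDir() || filepath.Ext(e.Name()) != ".md" {
			continue
		}
		body, err := os.ReadFile(filepath.Join(exportDir, e.Name()))
		if err != nil {
			panic(err)
		}
		// Only notes explicitly tagged for publishing get pushed.
		if !hasTags(string(body), "#blog", "#published") {
			continue
		}
		resp, err := http.Post(endpoint, "text/markdown", bytes.NewReader(body))
		if err != nil {
			panic(err)
		}
		resp.Body.Close()
		fmt.Println("pushed", e.Name(), resp.Status)
	}
}
```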
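The two entry types above might map to Go structs like these; the field names mirror the bullets, and anything extra is flagged as a guess in the comments.

```go
package feed

import "time"

// BlogEntry mirrors the "blog entry" type: a post authored on (or pulled into) the site.
type BlogEntry struct {
	ID        string    `json:"id"`
	Timestamp time.Time `json:"timestamp"` // date/time of the post
	Published bool      `json:"published"` // status (published or not)
	Title     string    `json:"title"`
	Content   string    `json:"content"` // Markdown body
	Tags      []string  `json:"tags"`
}

// URLEntry mirrors the "url" type: a link shared to the feed, with the target cached.
type URLEntry struct {
	ID        string    `json:"id"`
	Timestamp time.Time `json:"timestamp"`
	URL       string    `json:"url"`     // the original URL
	Content   string    `json:"content"` // cached copy of the page once posted
	Tags      []string  `json:"tags"`    // not in the bullet list, but likely needed for routing
}
```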
[[The ultimate notes app and workflow]]
Created: Sunday, February 18, 1990
- Updated: Sunday, August 18, 2024