There is a countless number of potential investments and opportunities out there. Why not search them automatically and reduce them to a data set that can be investigated more deeply to find the “right” one?
The solution is a highly tailored crawler written in Python.
The Data Model
By crawling a page, it is possible to extract the relevant data.
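To make this concrete, here is a minimal sketch of what one extraction step could look like. It is only an illustration of the idea, not the actual implementation: the Fundamentals data class, the field names, the CSS selectors, and the URL handling are all assumptions, and every real data source needs its own tailored parsing logic.

from dataclasses import dataclass
from typing import Optional

import requests
from bs4 import BeautifulSoup


@dataclass
class Fundamentals:
    """Relevant data extracted per instrument (fields are illustrative)."""
    isin: str
    name: str
    price: Optional[float] = None
    per: Optional[float] = None            # price/earnings ratio
    dividend_yield: Optional[float] = None


def crawl_page(url: str) -> Fundamentals:
    """Fetch one overview page and pull out the fields of interest.

    The CSS selectors below are placeholders -- each source site
    exposes the numbers differently.
    """
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    def as_float(selector: str) -> Optional[float]:
        node = soup.select_one(selector)
        if node is None:
            return None
        try:
            return float(node.get_text(strip=True).replace(",", "."))
        except ValueError:
            return None

    return Fundamentals(
        isin=soup.select_one(".isin").get_text(strip=True),
        name=soup.select_one("h1").get_text(strip=True),
        price=as_float(".price"),
        per=as_float(".per"),
        dividend_yield=as_float(".dividend-yield"),
    )

Running such a function regularly over a list of URLs yields the data set that can then be screened for candidates.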
If the mail is relevant but no action is needed, it might be a good idea to store it. If you really need it as a long-term reference, store it in a dedicated reference folder. If you just want to keep it because it might be useful in the coming months, create a folder that automatically deletes content older than 6 months.
If action is needed but you can delegate it, forward it to that person with a clear expectation/deadline and put it in a scheduled review folder, typically reviewed weekly.
If it is my personal task, let's judge how long it might take. If it takes more than 2 minutes to work on the topic, let's do it in a dedicated blocked time slot reserved for such tasks during the day.
If action is needed by me personally and it can be done within 2 minutes – just do it and feel happy that it is done 😉
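Stated as code, this triage is just a small decision tree. The following sketch is purely illustrative: the function and its parameters are hypothetical and simply mirror the rules above.

from datetime import timedelta


def triage_mail(action_needed: bool, can_delegate: bool,
                effort: timedelta, long_term_reference: bool) -> str:
    """Return the recommended handling for an incoming mail.

    Hypothetical helper -- the parameters just encode the rules above.
    """
    if not action_needed:
        # Relevant, but nothing to do: keep as reference or let it expire.
        return "reference folder" if long_term_reference else "auto-expiring folder (6 months)"
    if can_delegate:
        return "forward with a clear deadline, track in the weekly review folder"
    if effort > timedelta(minutes=2):
        return "work on it in a blocked time slot"
    return "do it now"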
Portfolio overview: Managed using “Portfolio Performance” (Open Source, Freeware, Linux)
Find potential investments: By regularly crawling fundamental data and providing an overview. This is managed via a Python-driven crawler. Keyword “fpi”
Portfolio management w/o emotions: Following the risk management philosophy of Markowitz, the chosen investments are balanced based on return and volatility. This is managed via a MATLAB-driven model; see the sketch after this list. Keyword “pme”
Buy/Sell optimization: Support for predicting the right buying/selling price. Keyword “bsp”
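The balancing model itself runs in MATLAB; as an illustration of the Markowitz idea behind it, here is a minimal Python/NumPy sketch with entirely made-up example numbers. The expected returns and covariance matrix are placeholders, and the real model may apply constraints (e.g. no short positions) that this sketch ignores.

import numpy as np

# Illustrative inputs only: expected annual returns and return covariance
# for three hypothetical instruments. In practice these would come from
# the crawled fundamental/price data.
mu = np.array([0.06, 0.04, 0.08])             # expected returns
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.090]])        # covariance of returns

# Markowitz tangency portfolio (risk-free rate assumed 0, no constraints):
# weights proportional to inv(Sigma) @ mu, normalized to sum to 1.
raw = np.linalg.solve(cov, mu)
weights = raw / raw.sum()

port_return = weights @ mu
port_vol = np.sqrt(weights @ cov @ weights)

print("weights:", np.round(weights, 3))
print(f"expected return: {port_return:.2%}, volatility: {port_vol:.2%}")

The point of handing this step to a model is exactly the "w/o emotions" part: the weights follow from the numbers, not from gut feeling.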