I have just finished compiling my statistics for the 2020 solar year, and overall it was a good year. Production from my panels was down from 3,591 kilowatt-hours in 2019 to 3,523 in 2020. Below is my monthly production since 2015:
In January, I noticed a couple panels quit producing, and Pacific NW Solar was able to reboot them remotely. In March I had a power inverter go out and need replacement, which Pacific NW Solar did under warranty. I highly recommend that anyone installing a solar system get the wireless reporting module, as that is how I detected the panels weren't producing.
One of these days I will write a routine to automate the checking of production by panel, but in the meantime I just go out periodically and verify all is performing normally. Luckily the first outage was in January, when production is almost non-existent anyway, and the second was in March, when I am still not at peak production. Overall, maintenance continues to be manageable. I am down to cleaning the panels once a year in spring (before peak production starts), and that seems to be enough to keep them producing at optimum levels.
Payback Update
My current estimated payback year for this solar system is now targeted at the end of 2024. In reading through all the incentive information, it appears 2020 was the last year of incentives for systems installed in 2015. This will take a huge chunk out of my annual revenue. I have a balance of about $1,900 to pay off to break even, and without the incentives the revenue my panels produce is estimated at just over $450 a year. One positive from a production point of view is that the marginal rate on electricity has now gone over $0.11 per kilowatt-hour, which will help the payback. Rates have risen much more slowly than I projected when I installed the panels, but I do anticipate they will continue to rise.
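As a quick sanity check on that target date, using only the figures above and ignoring rate growth, here is the back-of-the-envelope math (in C#, just for fun):

```csharp
using System;

// Back-of-the-envelope payback check using the figures from this post.
double balanceRemaining = 1900; // dollars left to break even
double annualRevenue = 450;     // estimated value of annual production, post-incentive

double yearsLeft = balanceRemaining / annualRevenue;
Console.WriteLine($"{yearsLeft:F1} years remaining"); // ~4.2 years from the end of 2020
// That lands right around the end of 2024; rising rates would pull it in a bit.
```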
Usage Update
In this year of working full time at home, it's not surprising that I saw a big increase in usage. Our electrical usage was up 11% for the year, after three previous years of slowing consumption. The amount of excess solar energy our panels returned to PSE (our electric utility) was also way down, indicating much more daytime use during peak production hours. Given that this work-from-home thing may be a permanent way of life, I don't anticipate usage going back down to previous levels anytime soon.
I have no changes planned for my configuration; I just hope things keep humming along. As always, if you are considering an installation and have questions for a solar system owner, feel free to leave a comment.
For years I have been working to eliminate emotion from investing, as I believe emotion causes bad decisions. Anytime I look at an investment and realize I am holding it because I 'hope' it will go up, that tells me emotion is entering the decision process. In past posts I have referenced the investment model I use to make investment decisions, and it has gone a long way toward using more analytics and less emotion, but building an investing cadence has also helped.
Before I go too much into my investing process, let me digress and mention that some of my best investments have been made not using analytics, but using more thoughtful reasoning. Probably my two best stock investments ever were Microsoft in the early '90s and Amazon in the late 2000s. Both of these stocks were ridiculously overpriced at the time, and the analytics I used then would not have supported those investments. But I took a 'gut level' flyer on both because I reasoned out the long-term trajectory of these companies, something my analytics tend to ignore. I am sure there are examples where this approach has burned me (and I conveniently erased those mistakes from memory), but I do think there are times when you have to look outside the analytics.
Over the years I have built a routine, shared below, that helps me make investment decisions unemotionally. Reasoning without emotion is still a critical factor.
My investment model attempts to predict which stocks will outperform the market in the following month. I purposely rigged the model to unveil the next month's predictions on the 20th of every month. So starting on the 20th, I look at my holdings, compare them to the projections, and make buy/sell/hold decisions over the next ten days. I like to have my changes settled by the first of the month, so that on the first my portfolios reflect my plan for that month. This is important because at the start of the month I programmatically take a snapshot of my portfolios for performance measurement purposes. I try to limit my trades between the 1st and the 20th so that I can easily measure my performance metrics.
I think it is a positive to get away from trading for 20 days each month. I typically spend that time researching investments, making improvements to my investment model, or doing more creative thinking about future investing.
I have to admit, though, I don't get away from portfolio management completely during this time. Typically on the weekend, I will spend some time looking over my portfolio, filling in partial positions or trimming positions that might be too big. I try never to add new stock positions or sell out of positions during this time; I should always do that between the 20th and the 1st. But the nice thing about making some buy/sell decisions on the weekend is that the market is closed. So on Monday morning, I can revisit those weekend decisions and see if Monday-morning me agrees with weekend me.
I am fully aware most financial planners say it would be better to spend less time looking at your investments, and to just buy and hold and let them grow organically. I would also recommend that for most investors: just buy low-cost mutual funds and look at them quarterly or yearly. But I sleep better at night fully understanding my investments and being fully accountable for my investment performance. If I find my approach underperforms my mutual fund benchmarks, I think I would throw in the towel and find something else to occupy my time. But if you are like me and want to manage your own investments, give some thought to a routine that helps drive analytical decision making, and make sure you measure your performance.
Previously I posted a note on Blazor authentication and how well my approach worked. As it turns out, I think that approach was wrong, and it left me at a dead end. So I thought it would be worth going deeper into what I am trying to do, and how I think my new approach will work better.
When I first designed the layout and pattern for my Blazor application, I thought it was pretty straightforward. I wanted a login status bar at the top of the page showing the currently logged-in user, or a login button if no user is logged in. I put my loginbar control directly in the main layout page to keep all my other pages from having to deal with it.
The problem occurred when I built a user dashboard page that I wanted to work like this: when the page is requested and no one is logged in, it pops open a modal form (using Blazored Modal) and prompts the user to log in. After messing with it a bit, I could get the user dashboard page to refresh after login, but I could not get the login bar to update when the user logged in via this method.
After much googling and tinkering, I finally decided to create custom events to notify the components of a login. I had been avoiding this; after working with JavaScript events, I swore off dealing with that level of complexity on the front end. But after reading the docs, it appeared this would be a useful tool for any Blazor project. I was really getting stymied until I stumbled upon Jason Watmore's example on GitHub, a nice, simple example of communicating between components.
I was able to adapt the example to my scenario, and it turns out to be a nice, simple solution. Here is how it works: I created an 'OnLoginComplete' event, which my login form fires when there is a successful login. I added a listener for this event to the loginbar and my userdashboard page, so when the event fires, these components wake up, refresh themselves, and reflect the username and other data important to the logged-in user. Once I got the hang of it, it was quite simple.
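Here is a minimal sketch of the pattern as I adapted it. The class and member names here are illustrative of my setup, not pulled verbatim from Watmore's example; the service gets registered in DI so the login form and the listeners share one instance.

```csharp
using System;
using Microsoft.AspNetCore.Components;

// A tiny event hub shared via dependency injection, e.g.
// builder.Services.AddScoped<LoginEventService>();
public class LoginEventService
{
    // Fired by the login form after a successful login.
    public event Action<string>? OnLoginComplete;

    public void RaiseLoginComplete(string username) =>
        OnLoginComplete?.Invoke(username);
}

// The loginbar (and the userdashboard page) subscribes, refreshes, and unsubscribes.
public partial class LoginBar : ComponentBase, IDisposable
{
    [Inject] private LoginEventService LoginEvents { get; set; } = default!;

    protected override void OnInitialized() =>
        LoginEvents.OnLoginComplete += HandleLoginComplete;

    private void HandleLoginComplete(string username) =>
        InvokeAsync(StateHasChanged); // re-render to show the new user

    public void Dispose() =>
        LoginEvents.OnLoginComplete -= HandleLoginComplete;
}
```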
I thought about creating an ‘OnLogout’ event, but it turns out I don’t need it. For now the only place I have a logout button is in the loginbar at the top of the page, and when that is clicked I destroy the credentials and redirect back to the home page.
After many attempts I feel like I have the right solution, and I think firing and listening for events will be handy for future projects. I took a lot of wrong turns in arriving at this solution... but then isn't that part of the fun of learning new languages?
I just completed an analysis of Zillow and Redfin, two companies that are competing to disrupt the real estate space. Valuing these companies presents a challenge, because neither is consistently profitable and both are bloating their balance sheets with housing inventory. So I looked at visitor stats, which provided an interesting way to compare the value of these companies. Both companies sell at a very high premium to the market, but I believe that, long term, both could be interesting investments.
I have been building prediction modeling applications for years as an investor, as a way to identify when various asset classes or particular stocks may be over- or underpriced. My current model is over 15 years old and, as you might guess, has become a huge mess of spaghetti code that is difficult to modify.
Recently, I stumbled across a full suite of college football data and started to wonder if one could build a model to predict college football games. Rather than copy my existing investment model, I decided to mentally start from scratch and figure out the best way to design predictive models for maintainability. I now have a college football game prediction model up and running, using the new pattern I designed during this process:
Now this might be Data Science 101 to a data scientist, but this is not my area of expertise. My software suite is a SQL Server database and C#, tools I am very comfortable with. Rather than learn new tools and software specially built for data modeling, I thought it would be more interesting to design my own custom approach. I am a software developer, so my thinking on how to build this process was inspired by Model/View/Controller (MVC), a software design pattern that focuses on separation of logic for interconnected systems. Taking this foundation, I have broken the process of setting up and managing the model into four main components, sketched in code after the four items below.
Create Program to Load Data. Before I build a model, I have to make sure I have access to the data necessary to power it. There are plenty of great APIs for gathering investment data, and if necessary, data can be gathered via scraping. I have a good library of tools for calling APIs and a nice suite of data scraping tools. Building the logic usually takes some time, but it can be nicely compartmentalized for easy maintenance.
Create Program to Regression Test Various Assumptions. Before building the program, you have to define a rough set of assumptions about the causes and effects of various factors. The set of assumptions you create is limited only by the data you have available. For example, for my college football prediction model, one assumption I tested was that a team is more valuable after a big home loss. The reasoning is that the team might be more motivated to do well following a bad home loss, while potential bettors are soured on the team. So you look at the data you have, then create various assumptions you can test against it. Once you have a set of assumptions, you create a program that fires them at your prediction engine, varying the weight of each assumption on each run. Doing this, you hopefully identify assumptions that have no correlation to future performance, and ones that have a strong correlation or inverse correlation. Below I have expanded on how the prediction engine is built, as it is a core piece of the program.
Create Program to Calculate the 'Best' Predictions. Once you have tested various factors against your historical data, choose the factors and weightings that performed best of all the combinations you fired at the prediction engine. This program generates the predictions, then looks at the current price (or the current betting line, in the case of my college football model) to determine the 'best' value prediction. Note that I plan to rerun my regression tests on this model quarterly, so I can see how well the assumption weightings are holding up. If some start to deteriorate, I may adjust factors and weightings as appropriate.
Create Program to Track Predictions and Update Results. I think this is perhaps the most important piece. The prediction engine bases its predictions on past data, so it is important to see whether past data accurately predicts future results. For example, for the college football predictions, every Monday I run a job that updates the weekend scores, then compares the results to my predictions for the week. Each week I look closer at the losses to see what I missed, which may give me ideas for additional factors to add. Of course, new factors may mean collecting more data, which further adds to the effort of building and maintaining the model. It is a very iterative process, as optimizations can always be made.
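Here is that sketch, one illustrative interface per component. All of these names are hypothetical, not my actual code; the point is just the separation of concerns.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Illustrative shape of the four components.
public record Prediction(string EntityId, double PredictedValue, double MarketValue);

public interface IDataLoader // 1. load data via APIs or scraping
{
    Task LoadAsync(DateTime asOf);
}

public interface IRegressionTester // 2. fire assumptions at the engine with varying weights
{
    // Returns the best-performing weight for each factor name.
    Dictionary<string, double> FindBestWeights(IEnumerable<string> factorNames);
}

public interface IPredictionRunner // 3. generate predictions and pick the 'best' values
{
    IEnumerable<Prediction> Predict(Dictionary<string, double> weights);
}

public interface IResultTracker // 4. compare predictions to actual results
{
    void RecordActuals(DateTime periodEnd);
}
```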
The Prediction Engine
Building the prediction engine is an iterative process in itself. The plan is to start small, then slowly add calculations over time. As long as additions are managed in an organized manner, the code base should remain maintainable even after adding a large number of factors. The prediction engine (the big square in the diagram above) consists of three major parts.
a. Build Objects. The first thing to do when firing up the prediction engine is to pull the data stored in the database into a view model that exposes it in an easily accessible way. These are typically complex objects representing the entity you are making a prediction on (i.e., a football game, a stock market security, an asset class, etc.). For instance, the college football model pulls in a game object, which has two teams attached to it, with all the statistics and history needed for each team. A 'bad previous week home loss' factor, for example, requires looking at past game performance to see whether a team had a bad loss the previous week. As long as the data is there, that is a fairly simple subroutine to write.
b. Generate Predicted Value. Now that you have your data accessible, fire your list of assumption factors and weightings to calculate a value. To simplify the architecture, I have a separate subroutine for each factor calculation to avoid logic bloat. This allows me to isolate factors, adding new ones or deleting invalid ones as necessary.
c. Generate Recommended Action. Once you have calculated the value of all your assumptions against an object, you have a score for that object. That score can then be compared to the price of the object to see if there is any action to be taken. Take a college football game: suppose that, given your assumptions and the data available, step b calculated that the home team should win by 3 points. If the betting line has the home team favored by 14, and your threshold for action is a 7-point differential, then the recommended action would be to place a bet on the visiting team. The same works for a stock market security: if step b calculates a stock's value at $15 and the stock is priced at $10, the recommended action might be to buy the stock.
Note that it is also valuable to track the variability of the model, in the form of a standard deviation or R value. Some models may show a correlation but have a wide deviation. These deviations help you set your 'time to take action' price: typically, the wider the deviation, the higher I set my action threshold. Here is a rough sketch of steps b and c:
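(Illustrative C# only; the factor names, weights, and the Game/Team shapes are hypothetical, and the weights would come from the regression tests described above.)

```csharp
using System.Collections.Generic;

public record Team(string Name, bool BadHomeLossLastWeek);
public record Game(Team HomeTeam, Team VisitingTeam);

public class GamePredictionEngine
{
    private readonly Dictionary<string, double> _weights;
    private readonly double _actionThreshold; // set wider for high-deviation models

    public GamePredictionEngine(Dictionary<string, double> weights, double actionThreshold)
    {
        _weights = weights;
        _actionThreshold = actionThreshold;
    }

    // Step b: one subroutine per factor, summed with weights into a predicted margin.
    public double PredictHomeMargin(Game game) =>
        _weights["HomeField"] * HomeFieldFactor(game) +
        _weights["BadHomeLoss"] * BadHomeLossFactor(game);

    // Step c: compare the predicted margin to the betting line.
    public string RecommendAction(Game game, double homeTeamLine)
    {
        double edge = PredictHomeMargin(game) - homeTeamLine; // positive favors home team

        if (edge >= _actionThreshold) return "Bet home team";
        if (edge <= -_actionThreshold) return "Bet visiting team";
        return "No action";
    }

    private static double HomeFieldFactor(Game game) => 1.0; // placeholder calculation
    private static double BadHomeLossFactor(Game game) =>
        game.HomeTeam.BadHomeLossLastWeek ? 1.0 : 0.0;       // placeholder calculation
}
```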
Breaking the logic for this prediction engine into segmented parts should really help the management of the logic. In addition, I have a pretty good library of reusable logic components that I should be able to apply across multiple predictive models. My goal here is to slowly increase the size and scope of the calculations, while keeping the overall system pretty simple.
Now that I have my college football predictive model working, I will just continue to add assumptions to see if I can continue to increase the accuracy of my predictions. Then I will start tearing out components of my existing investment prediction engine, and rebuild it using this new model.
When will I be done with this project? Hopefully never. If all goes well, these models should be continually evolving and growing as more data is collected, and hopefully become more accurate.
Recently I got a comment on my first Blazor and WordPress blog post that my demo wasn't working. I hadn't looked at it in months, but sure enough, my weather forecast component wasn't rendering. Blazor was working, but it was defaulting to the 'route not found' condition.
I have learned a bit more since my original experiment, so I dug back into my demo and made some improvements.
In my first version, I used the router and set my @page directive to match the URL of the blog post.
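It looked something like this; the path here is illustrative (the real one matched the post's permalink):

```razor
@page "/2020/03/blazor-wordpress-demo"

<WeatherForecast />
```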
For some reason the router was no longer recognizing that route. I assume some WordPress upgrade messed with the URL rewriter, but I didn't look into it closely; I decided that was the wrong approach anyway. Instead, I updated the router (my app.razor file) to rely on the NotFound condition:
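(A sketch of the router; type names are illustrative.)

```razor
@* With no @page routes defined anywhere, every request falls through to
   NotFound, which always renders the default layout. *@
<Router AppAssembly="@typeof(Program).Assembly">
    <Found Context="routeData">
        <RouteView RouteData="@routeData" DefaultLayout="@typeof(DefaultLayout)" />
    </Found>
    <NotFound>
        <LayoutView Layout="@typeof(DefaultLayout)" />
    </NotFound>
</Router>
```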
Now my app has no routes defined, and I rely on my defaultlayout.razor page to always render. Note that I can likely get rid of the <Found> condition in the router; I didn't, only because it was working and I didn't want to mess with it anymore.
So my DefaultLayout.Razor file is pretty simple:
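(Again a sketch; the component name is illustrative.)

```razor
@inherits LayoutComponentBase

@* The layout ignores routing entirely and just renders the component. *@
<WeatherForecast />
```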
All it essentially does is render my weather forecast component. I think this is a much better pattern for embedding Blazor in WordPress. If you do need to deal with routing, my recommendation would be either to sniff the URL in your Blazor app and decide what to do, or to control the state of your components by setting variables that cause them to show or hide.
A couple other thoughts:
You may have to add MIME types to your WordPress server if your app doesn't work or you get console errors. Take a look at the web.config that gets generated when you publish, and you will see the ones you need. If you manage the WordPress hosting server, you can add them in your admin panel. If not, there are some WordPress plugins that allow you to manage MIME types.
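For reference, the relevant fragment of the generated web.config looks roughly like this (the exact entries vary by framework version):

```xml
<!-- Static content mappings from a published Blazor WebAssembly web.config. -->
<staticContent>
  <remove fileExtension=".blat" />
  <remove fileExtension=".dat" />
  <remove fileExtension=".dll" />
  <remove fileExtension=".wasm" />
  <mimeMap fileExtension=".blat" mimeType="application/octet-stream" />
  <mimeMap fileExtension=".dat" mimeType="application/octet-stream" />
  <mimeMap fileExtension=".dll" mimeType="application/octet-stream" />
  <mimeMap fileExtension=".wasm" mimeType="application/wasm" />
</staticContent>
```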
I am beginning to wonder if embedding Blazor in WordPress is the right architecture. The other option is to build a Blazor app that calls the WordPress API, pulling all the WordPress content into a standalone Blazor app. That way you can still maintain the content in WordPress, but you have the full flexibility of Blazor routing. If you don't change your themes a lot and don't require a lot of plugins, this approach might be better. Just a thought. SEO would be an issue with this approach, though, since search engines don't appear to index Blazor apps.
For reference, below is the working example of the Blazor rendered weather forecast:
[Embedded Blazor weather forecast component renders here]
I am surprised there is not more chatter on the internet about integrating WordPress and Blazor; it's a pretty interesting way to quickly add components to a WordPress site. If you follow the instructions from my previous post, along with the things I learned above, you can easily get it set up.
For years I have held several bond funds, from corporate to high yield and international bonds, and I have been fairly satisfied with that low risk, low return portion of the portfolio.
As mentioned in a previous post, I am re-evaluating my allocation to bonds and bond funds, but I have also been experimenting with buying bonds directly rather than funds. It has been an interesting experience, and I thought it would be worth recapping a few points below.
High Yield Bonds. For years I have owned the Vanguard High-Yield bond fund, which is probably the lowest-cost high-yield fund out there. Earlier this year, I reduced my allocation to high-yield bonds, primarily because they tend to correlate more with the stock market than the bond market. Since my objective in buying bonds is to diversify away from the stock market, that kind of defeats the purpose. However, as rates have fallen further and further, I decided to switch some of my low-yield corporate bond allocation to high yield, and take control of buying the bonds myself. The hope is that I can get a little more yield than regular bonds while keeping the correlation with the stock market lower. High-yield bond funds typically have a huge allocation to energy companies, which in this environment I also want to stay away from. So I have been building a portfolio of reasonable-quality, short-term maturities that are technically high yield but that I expect will behave more like ordinary bonds. I have bought bonds issued by companies such as T-Mobile, Netflix, Best Buy, and Lennar Corp, which I feel pretty comfortable owning. Of course, I sacrifice some yield for this, but I feel better owning bonds of companies I understand.
TIPS. I can’t do a post on bonds without mentioning TIPS – Treasury Inflation protected securities. TIPS are issued by the Federal government and the interest rate is pegged to the inflation rate. These rates naturally are very low.. but then what isn’t? My recommendation for anybody considering TIPs is to make sure and buy I-Bonds from treasurydirect.gov first. The government limits the purchase of I-Bonds to $10,000 per Social security number. The current composite rate for I-Bonds is 1.06% which in most worlds is terrible. However, you can withdraw them penalty free after 5 years (so effectively a 5 year maturity), plus you get inflation protection. In past years there has been a fixed component to the rate, so it was even a better deal, but the current rate is still above a 5 year CD rate, and if inflation increases, you will do even better. And you know the Fed is trying everything it can to increase inflation.
Foreign Bonds. I own foreign bond funds, and I was considering starting to allocate to individual foreign bonds. So I started by looking at the holdings in the Vanguard Emerging Market Bond ETF. This fund has a yield of 3.89%, which is nice in this environment. However, the average maturity is pretty high at 13+ years, which is way longer than I would typically buy a bond for (who knows what the world will look like in 13 years?). It then occurred to me that I have no strong feeling about which countries are overpriced or underpriced. I also compared the market value the fund listed for some of its bonds to the price I could actually buy those bonds for, and in most cases the listed price was about 1% below my purchase price. I am pretty leery of bond spreads, especially when rates are so low; a 1% spread is in many cases a year's worth of interest. So for now, I think I will continue to buy foreign bonds through funds, as I don't think I can outperform the experts by 1%.
My foray into holding individual bonds has been interesting, and it makes me feel better to understand exactly what I own. Even if I were a bond wizard, I don't think I could win big holding individual bonds versus funds; the biggest benefit is probably the education in how the bond market works.
I have been building out my first Blazor application and figuring out the pattern I want to follow for user authentication. Rather than use the built-in Blazor authentication components, I am using the libraries I have built over the years, which I can plug into almost any app without having to think about it.
Where I got tripped up a bit with Blazor was in communicating the logged-in and logged-out status across components. After lots of Googling and experimenting, here is what I came up with:
In my Blazor Mainlayout page, I embedded the login bar component directly. This login bar shows a login button, or, if the user is logged in, a logout button and the user's name. When the sign-in button is clicked, the login bar launches a login form component to collect the username and password and call the authentication API with the credentials. When a login succeeds, the login bar updates a singleton class, applicationvars.cs, that holds all the profile information. This singleton allows all pages in the application to access these values. It is a nice, compact, and elegant way to handle what we used to call global and session variables, and is rapidly becoming one of my favorite things about Blazor.
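A minimal sketch of that singleton (the property names are illustrative):

```csharp
// ApplicationVars.cs -- shared profile state for the whole app.
public class ApplicationVars
{
    public string? Username { get; set; }
    public bool IsLoggedIn => !string.IsNullOrEmpty(Username);
}

// Registered once at startup. In Blazor WebAssembly a singleton is one
// instance per browser session, so it behaves like per-user session state:
// builder.Services.AddSingleton<ApplicationVars>();
```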
Where I got tripped up was when I wanted to enforce a logout, or deal with an expired user session and message it appropriately. I finally settled on the solution of adding multiple routes to the same index page. I built routes called '/logout' and '/expired' and added them to the index page:
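(The extra routes are just additional @page directives on the index page.)

```razor
@page "/"
@page "/logout"
@page "/expired"
```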
Where I erred was in putting the logic to destroy the credentials in applicationvars inside the index page, which renders after the login bar, so the login bar didn't reflect the credentials change. I futzed around with trying to get the index page to fire an event when the logout is detected, and have the login bar listen for the event and update appropriately. The better solution was to have the loginbar sniff the URL: when it sees the logout or expired URL, it destroys the credentials and redirects to the appropriate index URL. That was much more elegant than getting the event wired up. It makes me wonder if there is a new rule I need to follow: for your average business application, if you have to fire an event, you are probably doing something wrong in your design.
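A sketch of the URL sniffing in the loginbar (names illustrative):

```csharp
// In LoginBar.razor's @code block.
[Inject] private NavigationManager Nav { get; set; } = default!;
[Inject] private ApplicationVars AppVars { get; set; } = default!;

protected override void OnInitialized()
{
    var path = Nav.ToAbsoluteUri(Nav.Uri).AbsolutePath;
    if (path == "/logout" || path == "/expired")
    {
        AppVars.Username = null; // destroy the credentials
        Nav.NavigateTo("/");     // back to the plain index route
        // An '/expired' hit could also set a flag so the page can show
        // a 'session expired' message.
    }
}
```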
This design is working, and so far it has nicely isolated the authentication logic from the rest of the application. One worry is that there is no easy way for the app to communicate with the login bar, so I may have isolated it too much. In various recent Blazor blog posts, I have noticed people moving away from the singleton approach in favor of wrapping application vars around the whole application in the router. This new technique doesn't change the overall concept, so I may adopt that pattern in my next project. But for now, I will proceed with this pattern and see how many times it paints me into a corner I don't want to be in.
The craziness of markets over the last few months, caused by the coronavirus, has been incredible. As an investor, I am trying to forecast where we go from here, and what the best and worst possible outcomes are. While the stock market and its disconnect from the economy seem to be getting most of the headlines, I think the bigger story is the bond market and where it goes from here.
As seen in the chart above, the 2-year Treasury yield has dropped from near 3.00% in late 2018 to 0.13% in September 2020. This, along with the government's fiscal stimulus, is probably most responsible for the stock market's disconnect from the economy.
The stimulus here is unprecedented – dwarfing the financial crisis of 2009:
The stock market is fueled by stimulus, and seems to be signalling full speed ahead. Given the government intervention, it is nearly impossible to determine if the market is overvalued or undervalued, as the economy is a secondary factor to stimulus.
I also think the stock market is being fueled by bond investors jumping ship. Risk-averse bond investors aren't reinvesting bonds that mature at 0.14%; instead they invest in an Apple or a Microsoft that has a growing dividend and is 'safe'. I understand the appeal of this strategy, but at these prices it seems like a lot of risk is being taken on out of desperation.
At any rate, one almost-sure bet is that the bond market will be a loser. This bet does assume that the aversion to negative interest rates in the US will persist. Given that assumption, the best case for investors in 2-year Treasuries would be a no-inflation or deflationary economy, with rates at or near zero. Even then, the rate of return on bonds is still near zero.
The worst-case scenario for bond investors is that all this stimulus in the hands of consumers, combined with an economy waking from the COVID shutdown, leads to inflation. The only hope the government has of reducing the debt is to increase inflation, so those in charge of the money press are incented to cause some. The government has targeted, and pretty much achieved, a 2% inflation rate for the last few years. The Federal Reserve recently adjusted its inflation mandate to declare it may allow overshooting the 2% target, a further sign of inflation in the medium to long term. This does not bode well for a T-Bill yielding 0.14%.
So what to do with my bond portfolio? My asset allocation currently has a percentage in US bonds, and if I am to reduce that allocation, where do I put it? Lots of options, but none that I really like:
Increase Allocation to the Stock Market. One option is to increase my allocation to the stock market, maybe in the 'bond proxy' sectors such as utilities and financials (JP Morgan is yielding over 3.6%, and the government won't let anything happen to that bank). Most local and regional 'safe' utilities yield in the 2-4% range. Another option might be focusing on reasonably yielding, low-valuation stocks. Rocky Brands comes to mind, with a near-2% yield, a price/book around 1, a P/E of 9, and a little growth. Many bond investors have already flocked to the market and these sectors, so it may be too late to buy these possibly inflated assets. This strategy definitely adds risk.
Increase Allocation to Gold. I have an allocation to gold already, though I hate the idea of holding rocks in my portfolio. But the way the world is printing money, gold has had a pretty good year, and the money printing won't be stopping anytime soon. I have been experimenting with buying quality gold miners, then selling covered calls about 10% above the current price, which generates pretty good income. This is still an experimental strategy, but it seems to be doing no worse than just holding gold (via BAR) and gold mining stocks, and it makes me feel better about holding this asset class.
Increase Allocation to TIPS (Treasury Inflation-Protected Securities). An interesting play here: TIPS are pegged to the inflation rate, so if inflation does hit, your interest rate increases. If we have deflation, yields will turn negative (unless you buy I-Bonds via Treasury Direct, which have a floor rate of 0%, but there is a limit on annual purchases). It is probably worth taking your chances with TIPS over fixed-rate bonds. The Vanguard TIPS mutual fund is currently yielding over 2%, so that might be an interesting option, but it's not going to make you rich. It has also moved up a lot over the last few months, as other people figured this out several months ago.
Increase Allocation to Real Estate/REITs. This is my least favorite diversification play. In volatile times, REITs move more like stocks than bonds, so moving from bonds to economy-dependent real estate is a huge increase in risk. There are certain areas of REITs that may be interesting for 'safer' diversification (i.e., farmland REITs), but historically they haven't performed well. I also believe the fallout from COVID has yet to be reflected in commercial REITs.
I am not alone in facing these choices; most investors saving for retirement or in retirement face the same dilemma. Right now I am leaning toward increasing my allocation to the stock market, really emphasizing low-P/E or low price-to-book stocks in defensive sectors that haven't participated in the recent tech-bubble run-up. Those stocks are out there; it just requires some digging.
I have been reading a lot of bad reviews of Microsoft's new dual-screen device, the Surface Duo. What I find most surprising is that most of the bad reviews center on the lack of purpose for a dual-screen device. The Windows Blog provides a pretty good overview of the device.
Surface Duo
It seems to me there would be demand for a dual-screen device, not so much for single-app use, but for business multitasking. The logical use case is to have one side of the phone for apps that consume media, and the other side for apps where you create media. So if I were to get one, I would likely keep my email and messaging apps on one side, and my web browser and reading apps on the other. The other use cases Microsoft shows, which essentially use the device as a double-wide screen, are less compelling to me, as even a double-wide screen is probably too small to do any real work. But as a productivity tool, I think this form factor will catch on.
Besides the consternation over how to use the device, the bad reviews have focused on its low specs. On this point I agree: for a list price of $1,400, I would want a more high-end device.
The low-end specifications for such a high-priced device will put this product at a disadvantage, and it may not succeed. But I don't necessarily think Microsoft was planning on a huge success in this round. This is an experiment, and I think they are looking to see what users and app developers do with this form factor. Again, at $1,400 this device isn't for me, but I give Microsoft credit for putting it out there.