VIRTUAL DAN

Notes from my travels around the internet

Why I Finally Bought Crypto

I finally broke down and bought some cryptocurrency. While I was never in the camp that crypto was a fad, I was never enough of a believer to actually buy any. What finally tipped the scales for me was seeing all this big money going into Bitcoin. More and more big Wall Street money managers are hedging using crypto, and soon you will see ETFs available, which will make it much easier to buy and more liquid. So I finally bought a tiny amount, so that at least I can get a better feel for where crypto fits in my investment portfolio.

First I had to decide on a crypto coin. I finally decided to buy Ethereum – not Bitcoin.

I bought Ethereum not because I know what I am doing; I just weighed a few factors:

  • First, I wanted a coin with one of the top market caps. There are a zillion different cryptocurrencies, and the only ones that are valuable are the ones that people think are valuable. The bigger a market cap gets, the more it represents trust. Here is a chart of the top 10 cryptos ranked by market cap:
  • Second, I noticed that non-fungible tokens (NFTs) primarily trade in Ethereum. Now that NFTs are here, cryptocurrencies are starting to look pretty mainstream. I am not enough of a believer in NFTs to consider buying any, but I do think it's an interesting concept – more on that in a future post.
  • Bitcoin gets a bad rap because of the energy it takes to process transactions, and indeed it is much more energy intensive:

I assume that if Ethereum reaches the scale of Bitcoin its energy usage will rise to match, but Ethereum does have a plan on its technology roadmap to make it much more energy efficient. While Ethereum has not performed as well as Bitcoin, as seen in the chart above, it tracks it closely enough for me until I have a better idea of what my strategy is.

Note also from the chart above how both Bitcoin and Ethereum have performed as compared with gold (GLD). One would think gold would somewhat match the chart of crypto, but it really hasn't. I own a small amount of gold stocks as a hedge on the stock market, and I can foresee wanting to use crypto as a hedge on my gold hedge. I can easily see the rise of crypto and NFTs hurting the price of gold. While crypto and NFTs have no intrinsic value, gold is only a physical representation of an asset that has little intrinsic value. It's a complicated world.

As far as how I bought my Ethereum, I decided to go the easy way and buy it through PayPal – it was a very simple transaction. Crypto purists would say you really want to have your own wallet and store it off the grid, but I have such a tiny amount that it's not worth the hassle at this point. I guess if the apocalypse comes and we are bartering cryptocurrency, I may regret that decision.

For now, I am pretty much in watch-and-learn mode. This is a crazy investing world of late, and the inflow of the younger generations into investing is really challenging some long-standing strategies. Crypto may just be a bubble investment, or it might be the start of something big. Only time will tell.

March 17, 2021 Dan

The Problem With the Gamestop Revolution

Lots has been written about the recent eruption of Gamestop stock as part of the Reddit rally against the big money on Wall Street, but I thought I would do a quick post just to sum up my thoughts. As you may recall, Gamestop stock went up 1000% after the Reddit WallStreetBets board rallied small retail investors to buy Gamestop because it was shorted 140%.

Gamestop Stock Price

I think the initial strategy was correct – the fact that GME was shorted 140% was a mistake by big-money algorithms that failed to take that risk into account. The Reddit member who spotted that, and spurred others via the forum to buy the stock, found a creative way to profit off that mistake. I think those that got in during the first few days of the price move did pretty well. However, I think most of the money made after January 26th was made by the big-money hedge funds that the Reddit movement was looking to punish.

A lot of the sentiment among the retail buyers was driven by hatred of the big money on Wall Street, and the feeling that the market is rigged against the small investor. On that point, I think it was a success, because it exposed some of the tricks the big money used to kill the movement. While maybe Robinhood (the retail broker at the center of this) did have some fiscal reason to prevent traders on its platform from buying Gamestop stock during the height of the movement, I find it hard to believe the big money behind Robinhood did not put pressure on the broker to keep retail investors from squeezing the shorts. This also brought to light the 'front-running' of trades that pays for 'no-commission' trading: when you trade a stock on a no-commission basis, the money is clearly made up by the clearing house not giving you the best price. I also believe the expansion of the Gamestop trade to target other shorted stocks and silver was a deliberate diversion from the Gamestop movement.

The biggest fault I find with all this is that it was driven by emotion. Anger at the big money should not have led investors to ignore basic stock fundamentals. One of the earliest lessons I learned (and keep trying not to repeat) is 'don't fall in love with a stock'. The mantra of 'hold at all costs' on Reddit is a failed strategy. This emotion is what led the small investors who joined the movement late to hand their money to the hedge funds, who know more about price fundamentals. While I don't disagree with the sentiment on Reddit, I don't think having thousands of small investors lose money on an irrational trade is the best response.

So that’s my take. I am heartened to see the younger generations take up these activist positions, and perhaps some good ideas came of this event. And I hope these small investors learned a little about mixing emotions with investing, and how important it is in most cases to keep them separated.

February 12, 2021 Dan

2020 Solar Year in Review

I have just finished compiling my statistics for the 2020 solar year, and overall it was a good year. Production from my panels was down from 3,591 kilowatt-hours in 2019 to 3,523 in 2020. Below is my monthly production since 2015:

In January, I noticed a couple of panels had quit producing, and Pacific NW Solar was able to reboot them remotely. In March I had a power inverter go out and need to be replaced, which Pacific NW Solar did under warranty. I highly recommend that anyone installing a solar system get the wireless reporting module, as that is how I detected the panels weren't producing:

One of these days I will write a routine to automate the checking of production by panel, but in the meantime I just go out periodically and verify all is performing normally. Luckily the first outage was in January, when production is almost non-existent anyway, and the second in March, when I am still not at peak production. Overall, maintenance continues to be manageable. I am down to cleaning the panels once a year in spring (before peak production starts), and that seems to be enough to keep them producing at optimum output.
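
If I ever do write that routine, the core check would be pretty simple. Below is a minimal C# sketch of the idea, assuming the monitoring system can hand back a daily watt-hour reading per panel; the type and method names here are made up for illustration, not pulled from the actual reporting API.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical per-panel daily reading pulled from the monitoring system.
    public record PanelReading(string PanelId, double WattHours);

    public static class ProductionCheck
    {
        // Flag any panel producing less than half of the array's median daily output.
        public static List<PanelReading> FindUnderperformers(
            IReadOnlyList<PanelReading> readings, double threshold = 0.5)
        {
            if (readings.Count == 0) return new List<PanelReading>();

            var sorted = readings.Select(r => r.WattHours).OrderBy(w => w).ToList();
            double median = sorted[sorted.Count / 2];

            return readings.Where(r => r.WattHours < median * threshold).ToList();
        }
    }

Comparing each panel against the array median (rather than a fixed number) keeps the check meaningful in January and July alike.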

Payback Update

My current estimated payback year for this solar system is now targeted at the end of 2024. In reading through all the incentive information, it appears 2020 was the last year of incentives for systems installed in 2015, and that will take a huge chunk out of my annual revenue. I have a balance of about $1,900 to pay off to break even, and without the incentives the revenue my panels produce is estimated at just over $450 a year. At that rate it takes roughly four more years to cover the balance, which is how I arrive at the end of 2024. One positive from a production point of view is that the marginal rate on electricity has now gone over $0.11 a kilowatt-hour, so that will help the payback. Rates have risen much more slowly than I projected when I installed the panels, but I do anticipate rates will continue to rise.

Usage Update

In this year of working full time at home, it's not surprising that I saw a big increase in usage. Our electrical usage was up 11% for the year, after three previous years of slowing consumption. The amount of excess solar energy our panels returned to PSE (our electric utility) was also way down, indicating much more usage during peak daytime production hours. Given that this work-from-home thing may be a permanent way of life, I don't anticipate usage going back down to previous levels anytime soon.

I have no changes planned for my configuration; I just hope things keep humming along. As always, if you are considering an installation and have any questions for a solar system owner, feel free to leave a comment.

January 10, 2021 Dan

Investing Cadence

For years I have been working to eliminate emotion from investing, as I believe emotion causes bad decisions. Anytime I look at an investment and decide I am holding it because I 'hope' it will go up, that tells me emotion is entering the decision process. In past posts I have referenced the investment model I use to make investment decisions, and it has gone a long way towards using more analytics and less emotion, but building an investing cadence has also helped.

Before I go too far into my investing process, let me digress and mention that some of my best investments have been made not using analytics, but using more thoughtful reasoning. Probably my two best stock investments ever were Microsoft in the early 90's and Amazon in the late 2000's. Both of these stocks were ridiculously overpriced at the time, and the analytics I used then would not have supported those investments. But I took a 'gut level' flyer on them because I reasoned out the long-term trajectory of these companies, which my analytics tend to ignore. I am sure there are examples of where this approach has burned me (and I have conveniently erased those mistakes from memory), but I do think there are times when you have to look outside of analytics.

Over the years I have built a routine, shared below, that helps me make investment decisions unemotionally – reasoning without emotion is still a critical factor.

My investment model attempts to predict which stocks will outperform the market in the following month. I purposely rigged the model to unveil the next month's predictions on the 20th of every month. So starting on the 20th, I look at my holdings, compare them to the projections, and decide what to buy, sell, or hold over the next 10 days. I like to have my changes settled by the first of the month, so that on the first my portfolios reflect my plan for that month. This is important because at the start of the month I programmatically take a snapshot of my portfolios for performance measurement purposes. I try to limit my trades between the 1st and the 20th so that I can easily measure my performance metrics.

I think it is a positive to get away from trading for 20 days each month. I typically spend time during those periods researching investments, or making improvements to my investment model, or doing more creative thinking on investing in the future.

I have to admit, though, I don't get away from portfolio management completely during this time. Typically on the weekend, I will spend some time looking over my portfolio and filling in partial positions or trimming positions that might be too big. I try never to add new stock positions or sell out of positions during this time; I should always do that between the 20th and the 1st. But the nice thing about making some buy/sell decisions on the weekend is that the market is closed. So on Monday morning, I can revisit those weekend decisions and see if Monday-morning me agrees with weekend me.

I am fully aware most financial planners say it would be better to spend less time looking at your investments, and just buy and hold and let them grow organically. I would also recommend that for most investors: just buy low-cost mutual funds and look at them quarterly or yearly. But for me, I can sleep better at night fully understanding my investments and being fully accountable for my investment performance. If I find my approach underperforms my mutual fund benchmarks, I think I would throw in the towel and find something else to occupy my time. But if you are like me, and want to manage your own investments, give some thought to a routine that helps drive analytical decision making, and make sure you measure your performance.

December 27, 2020 Dan

Blazor Authentication Continued

Previously I posted a note on Blazor authentication and how well my approach worked. As it turns out, I think that approach was wrong, and it left me at a dead end. So I thought it would be worth going deeper into what I am trying to do, and how I think my new approach will work better.

When I first designed the layout and pattern for my Blazor application, I thought it was pretty straightforward. I wanted a login status bar at the top of the page showing the currently logged-in user, or a login button if no user is logged in. I put my login bar component directly in the main layout page to keep all my other pages from having to deal with it.

The problem occurred when I built a user dashboard page that, when requested with no one logged in, should pop open a modal form (using Blazored Modal) and prompt the user to log in. After messing with it a bit, I could get the user dashboard page to refresh after login, but I could not get the login bar to update when the user logs in via this method.

After much googling and tinkering, I finally decided to create custom events to notify the components of a login. I had been avoiding this – after working with JavaScript events I swore off dealing with that level of complexity on the front end. But after reading the docs, it appeared that this would be a useful tool for any Blazor project. I was really getting stymied until I stumbled upon Jason Watmore's example on GitHub, which is a nice simple example of communicating between components.

I was able to adapt the example to my scenario, and it turns out to be a nice simple solution. Here is how it works. I created an 'OnLoginComplete' event which my login form fires when there is a successful login. I added a listener for this event to the login bar and my user dashboard page, so when the event is fired those components wake up, refresh themselves, and reflect the username and other data important to the logged-in user. Once I got the hang of it, it was quite simple.
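
For reference, a minimal sketch of this kind of wiring is below. It assumes a small shared service registered with dependency injection; the class and member names are illustrative rather than my exact code.

    using System;

    // Shared service that both the login form and the listening components inject.
    public class LoginEventService
    {
        public event Action<string>? OnLoginComplete;

        // Called by the login form after a successful login.
        public void NotifyLoginComplete(string userName) => OnLoginComplete?.Invoke(userName);
    }

A listening component such as the login bar then subscribes and re-renders when the event fires:

    @inject LoginEventService LoginEvents
    @implements IDisposable

    @code {
        protected override void OnInitialized() => LoginEvents.OnLoginComplete += HandleLogin;

        private void HandleLogin(string userName) => InvokeAsync(StateHasChanged);

        public void Dispose() => LoginEvents.OnLoginComplete -= HandleLogin;
    }

Unsubscribing in Dispose matters; otherwise the service keeps a reference to every component that ever listened.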

I thought about creating an 'OnLogout' event, but it turns out I don't need it. For now, the only place I have a logout button is in the login bar at the top of the page, and when it is clicked I destroy the credentials and redirect back to the home page.

After many attempts I feel like I have the right solution, and I think firing and listening for events will be handy for future projects. I took a lot of wrong turns in arriving at this solution, but then isn't that part of the fun of learning new languages?

December 9, 2020 Dan

Zillow vs Redfin

I just completed an analysis of Zillow and Redfin, two companies that are competing to disrupt the real estate space. Valuing these companies presents a challenge, because neither is consistently profitable and both are bloating their balance sheets with housing inventory. So I looked at visitor stats, which did provide an interesting way to compare the value of these companies. Both companies sell at a very high premium to the market, but I do believe that long term, both could be interesting investments.

You can view the complete analysis in the article I posted on Seeking Alpha – https://seekingalpha.com/article/4391067-zillow-vs-redfin-valuation-comparison-using-visitor-metrics

November 23, 2020 Dan

Building a Prediction Model Pattern

I have been building prediction modeling applications for years as an investor, as a way to try to identify when various asset classes or particular stocks may be over- or under-priced. My current model is over 15 years old and, as you might guess, is becoming a huge mess of code spaghetti that is difficult to modify.

Recently, I stumbled across a full suite of college football data, and started to wonder if one could build a model to predict college football games. Rather than try to copy my existing investment model, I decided to mentally start from scratch and figure out the best way to design predictive models for maintainability. I now have a college football game prediction model up and running, using my new pattern I designed during this process:

Now, this might be Data Science 101 to a data scientist, but this is not my area of expertise. My software suite is a SQL Server database and C#, tools I am very comfortable with. Rather than learn new tools and software specially built for data modeling, I thought it would be more interesting to design my own custom approach. I am a software developer, so my thinking on how to build this process was inspired by Model/View/Controller (MVC), a software design pattern that focuses on separation of logic for interconnected systems. Taking this foundation, I have broken the process of setting up and managing the model into 4 main components.

  1. Create Program to Load Data. Before I build a model, I have to make sure I have access to the data necessary to power it. There are plenty of great APIs for gathering investment data, and if necessary data can be gathered via scraping. I have a good library of tools to call APIs, and a nice suite of data scraping tools. Building this logic usually takes some time, but it can be nicely compartmentalized for easy maintenance.
  2. Create Program to Regression Test Various Assumptions. Before building the program, you have to define a rough set of assumptions about the causes and effects of various factors. The set of assumptions you create is limited only by the data you have available. For example, for my college football prediction model one assumption I tested was that a team is more valuable after a big home-game loss: the team might be more motivated to do well following a bad home loss, and potential bettors are soured on the team. So you look at the data you have, then create various assumptions you can test against it. Once you have a set of assumptions, you create a program that fires the assumptions at your prediction engine, varying the weight of each assumption on each run (a sketch of this weight sweep follows this list). Doing this, you hopefully identify assumptions that have no correlation to future performance, and ones that have a strong correlation or inverse correlation. Below I have expanded on how the prediction engine is built, as it is a core piece of the program.
  3. Create Program to Calculate the 'Best' Predictions. Once you have tested various factors against your historical data, choose the factors and weightings that performed best of all the combinations you fired at the prediction engine. This is what generates the predictions; it then looks at the current price (or the current betting line, in the case of my college football model) and determines the 'best' value prediction. Note that I plan to rerun my regression tests on this model quarterly, so that I can see how well the assumption weightings are holding up. If some start to deteriorate, I may adjust factors and weightings as appropriate.
  4. Create Program to Track Predictions and Update Results. I think this is perhaps the most important piece. The prediction engine bases its predictions on past data, so it is important to see whether past data accurately predicts future results. For example, for the college football predictions, every Monday I run a job that updates the weekend scores and then compares the results to my predictions for the week. Each week I look closer at the losses to see what I missed, which may give me ideas for additional factors to add. Of course, new factors may mean collecting more data, which further adds to the effort of building and maintaining the model. It is a very iterative process, as optimizations can always be made.
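
To make the weight sweep in step 2 concrete, here is a small, self-contained C# sketch. The game record, the two factors, and the error measure are simplified placeholders rather than my actual model; the point is just the shape of the loop.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Simplified stand-in for a historical game with a known final margin (home score minus visitor score).
    public record HistoricalGame(double HomeRestAdvantage, double BadHomeLossLastWeek, double ActualMargin);

    public static class WeightSweep
    {
        // One function per assumption; each returns the raw signal for that factor.
        private static readonly Func<HistoricalGame, double>[] Factors =
        {
            g => g.HomeRestAdvantage,
            g => g.BadHomeLossLastWeek
        };

        private static double Score(HistoricalGame g, double[] weights) =>
            Factors.Select((f, i) => weights[i] * f(g)).Sum();

        // Step each weight from 0.0 to 1.0 and keep the combination with the lowest mean absolute error.
        public static (double[] Weights, double Error) FindBest(IReadOnlyList<HistoricalGame> history)
        {
            double[] best = { 0, 0 };
            double bestError = double.MaxValue;

            for (double w0 = 0; w0 <= 1.0; w0 += 0.1)
            for (double w1 = 0; w1 <= 1.0; w1 += 0.1)
            {
                var weights = new[] { w0, w1 };
                double error = history.Average(g => Math.Abs(Score(g, weights) - g.ActualMargin));
                if (error < bestError) { bestError = error; best = weights; }
            }

            return (best, bestError);
        }
    }

A factor whose best weight keeps landing near zero is a candidate to drop; one whose weight stays consistently large is carrying real signal.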

The Prediction Engine

Building the prediction engine is an iterative process in itself. The plan is to start small, then slowly add additional calculations over time. As long as additions are managed in an organized manner, the code base should be maintainable even after adding a large number of factors. The prediction engine (described in the big square in the diagram above) consists of 3 major parts.

a. Build Objects. The first thing the prediction engine does is pull the data stored in the database into a view model that exposes it in an easily accessible way. These are typically complex objects that represent the entity you are making a prediction on (a football game, a stock market security, an asset class, etc.). For instance, the college football model pulls in a game object with two teams attached to it, along with all the statistics and history needed for each team. A 'bad previous-week home loss' factor, for example, requires looking at past game performance to see if the team had a bad loss in the previous week. As long as the data is there, that is a fairly simple subroutine to write.
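
As a rough illustration of the kind of object I mean (the field names are invented for this example, not the model's actual schema):

    // Hypothetical view model: one game, two teams, and whatever stats the factors need.
    public record TeamSnapshot(string Name, double AvgPointsFor, double AvgPointsAgainst,
                               bool BadHomeLossLastWeek);

    public record GameViewModel(TeamSnapshot HomeTeam, TeamSnapshot VisitingTeam,
                                double BettingLineHomeMargin);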

b. Generate Predicted Value. Now that the data is accessible, fire your list of assumption factors and weightings at it to calculate a value. To simplify the architecture, I have a separate subroutine for each factor calculation to avoid logic bloat. This lets me isolate factors, and add new ones or delete invalid ones as necessary.

c. Generate Recommended Action. Once you have calculated the value of all your assumptions against an object, you have a score for that object. That score can then be compared to the price of the object to see if there is any action to be taken. Take a college football game: suppose that, given your assumptions and the data available, step b calculates that the home team should win by 3 points. If the betting line has the home team favored by 14, and your threshold for action is a 7-point differential, then the recommended action is to place a bet on the visiting team. The same works for a stock market security: if step b calculates a value of $15 and the stock is priced at $10, the recommended action might be to buy the stock.
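
A small sketch of that threshold logic, using the football numbers from the example above (the method and parameter names are mine, for illustration):

    // Compare the model's predicted home-team margin to the betting line; only act past a threshold.
    public static string RecommendAction(double predictedHomeMargin, double lineHomeMargin,
                                         double threshold = 7.0)
    {
        double edge = predictedHomeMargin - lineHomeMargin;   // positive means the model likes the home team more than the line does
        if (edge >= threshold) return "Bet the home team";
        if (edge <= -threshold) return "Bet the visiting team";
        return "No action";
    }

With a predicted margin of 3 against a line of 14, the edge is -11, which clears the 7-point threshold and points to the visiting team.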

Note that it is also valuable to track the variability of the model, in the form of a standard deviation or R value. Some models may show a correlation but have a wide deviation. These deviations help you set your 'time to take action' price; typically, the wider the deviation, the higher I set my action threshold.

Breaking the logic for this prediction engine into segmented parts should really help the management of the logic. In addition, I have a pretty good library of reusable logic components that I should be able to apply across multiple predictive models. My goal here is to slowly increase the size and scope of the calculations, while keeping the overall system pretty simple.

Now that I have my college football predictive model working, I will just continue to add assumptions to see if I can continue to increase the accuracy of my predictions. Then I will start tearing out components of my existing investment prediction engine, and rebuild it using this new model.

When will I be done with this project? Hopefully never. If all goes well, these models should be continually evolving and growing as more data is collected, and hopefully become more accurate.

November 13, 2020 Dan

Blazor and WordPress – The Story Continues

Recently I got a comment on my first Blazor and WordPress blog post that my demo wasn't working. I hadn't looked at it in months, but sure enough my weather forecast component wasn't rendering. Blazor was working, but it was falling through to the 'route not found' condition.

I have learned a bit more since my original experiment, so I dug back into my demo and made some improvements.

In my first version, I used the router and set my @page directive to match the URL of the blog post.

For some reason the router was no longer recognizing that route. I assume some WordPress upgrade messed with the URL rewriter, but I didn't look into that closely – I decided that was the wrong approach. Instead, I just updated the router (my App.razor file) to rely on the NotFound condition:
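
For anyone following along, the general shape is something like the sketch below, based on the standard Blazor WebAssembly template with the DefaultLayout substituted in (your namespaces and layout name may differ):

    @* No page in the app matches the WordPress URL, so everything falls through to NotFound,
       which simply renders the default layout. *@
    <Router AppAssembly="@typeof(Program).Assembly">
        <Found Context="routeData">
            <RouteView RouteData="@routeData" DefaultLayout="@typeof(DefaultLayout)" />
        </Found>
        <NotFound>
            <LayoutView Layout="@typeof(DefaultLayout)">
                <p>Not found</p>
            </LayoutView>
        </NotFound>
    </Router>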

Now my app has no routes defined, and I rely on my DefaultLayout.razor page always rendering. Note that I could likely get rid of the <Found> condition in the router – I didn't, only because it was working and I didn't want to mess with it anymore.

So my DefaultLayout.razor file is pretty simple:
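
In sketch form, it is just a layout that renders the forecast component; the component name below is a placeholder for whatever your weather forecast component is actually called.

    @* Sketch of DefaultLayout.razor: inherit the layout base class and render the component. *@
    @inherits LayoutComponentBase

    <WeatherForecast />

    @Body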

All it essentially does is render my weather forecast component. I think this is a much better pattern to use when embedding Blazor in WordPress. If you do need to deal with routing, my recommendation would be to either sniff the URL in your Blazor app and decide what to do, or just control the state of your components by setting variables that cause components to either show or hide.

A couple other thoughts:

  • You may have to add MIME types to your WordPress server if your app doesn't work or you get console errors (typical entries are sketched after this list). Take a look at the web.config that gets generated when you publish, and you will see the ones you need. If you manage the WordPress hosting server, you can add them via your admin panel; if not, there are WordPress plugins that allow you to manage MIME types.
  • I am beginning to wonder if embedding Blazor in WordPress is the right architecture. The other option is to build a Blazor app that calls the WordPress API, and just have it pull all the WordPress content into your standalone Blazor app. That way you can still maintain the content in WordPress, but you have the full flexibility of Blazor routing. If you don't change your themes a lot, and you don't require a lot of plugins, this approach might be better. Just a thought. SEO would be an issue with this approach, though, since search engines don't appear to index Blazor apps.
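
On the MIME type point, the entries in question look something like the IIS-style fragment below. This is a partial sketch; the web.config produced by your publish step is the authoritative list, and hosts that aren't IIS will configure the same types differently.

    <staticContent>
      <remove fileExtension=".dll" />
      <mimeMap fileExtension=".dll" mimeType="application/octet-stream" />
      <remove fileExtension=".wasm" />
      <mimeMap fileExtension=".wasm" mimeType="application/wasm" />
      <remove fileExtension=".json" />
      <mimeMap fileExtension=".json" mimeType="application/json" />
    </staticContent>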

For reference, below is the working example of the Blazor rendered weather forecast:

Loading…

I am surprised there is not more chatter on the internet about integrating WordPress and Blazor – it's a pretty interesting way to quickly add components to a WordPress site. If you follow the instructions from my previous post, along with the things I learned above, you can easily get it set up.

October 29, 2020 Dan

Adventures in Bonds

For years I have held several bond funds, from corporate to high yield and international bonds, and I have been fairly satisfied with that low risk, low return portion of the portfolio.

As mentioned in a previous post, I am re-evaluating my allocation to bonds and bond funds, but I have also been experimenting with buying bonds directly rather than funds. It has been an interesting experience, and I thought it would be worth recapping a few points below.

  1. High Yield Bonds. For years I have owned the Vanguard High Yield bond fund, which is probably the lowest-cost high yield fund out there. Earlier this year, I reduced my allocation to high yield bonds, primarily because they tend to correlate more with the stock market than the bond market. Since my objective in buying bonds is to diversify away from the stock market, that kind of defeats the purpose. However, as rates have fallen further and further, I decided to switch some of my low-yield corporate bond allocation to high yield, and take control of buying the bonds myself. The hope is I can get a little higher yield than regular bonds while keeping the correlation with the stock market lower. High yield bond funds also typically have a huge allocation to the energy sector, which in this environment I want to stay away from. So I have been building a portfolio of reasonable-quality, short-term maturities that are technically high yield, but that I expect will behave more like ordinary bonds. I have bought bonds issued by companies such as T-Mobile, Netflix, Best Buy, and Lennar Corp, which I feel pretty comfortable owning. Of course, I sacrifice yield for this, but I do feel better owning bonds in companies I understand.
  2. TIPS. I can't do a post on bonds without mentioning TIPS – Treasury Inflation-Protected Securities. TIPS are issued by the federal government, and the interest rate is pegged to the inflation rate. These rates are naturally very low... but then, what isn't? My recommendation for anybody considering TIPS is to make sure to buy I-Bonds from treasurydirect.gov first. The government limits the purchase of I-Bonds to $10,000 per Social Security number. The current composite rate for I-Bonds is 1.06%, which in most worlds is terrible. However, you can withdraw them penalty-free after 5 years (so effectively a 5-year maturity), plus you get inflation protection. In past years there has been a fixed component to the rate, so it was an even better deal, but the current rate is still above a 5-year CD rate, and if inflation increases, you will do even better. And you know the Fed is trying everything it can to increase inflation.
  3. Foreign Bonds. I hold foreign bond funds, and I was considering starting to allocate to individual foreign bonds. So I started by looking at the holdings in the Vanguard Emerging Market Bond ETF. This fund has a yield of 3.89%, which is nice in this environment. However, the average maturity is pretty high at 13+ years, which is way longer than I would typically buy a bond for (who knows what the world will look like in 13 years?). It then occurred to me that I have no strong feeling about which countries are over- or under-priced. I also compared the market value the fund listed for some of its bonds with the price I could buy those bonds for, and in most cases the fund's listed price was about 1% cheaper than what I could buy the bond for. I am pretty leery about bond spreads, especially when rates are so low – a 1% spread is in many cases a year's worth of interest. So for now, I think I will continue to buy foreign bonds through funds, as I don't think I can outperform the experts by 1%.

My foray into holding individual bonds has been interesting, and it makes me feel better when I understand what I own – that is probably the biggest practical benefit of buying bonds outright. Even if I were a bond wizard, I don't think I could win big holding individual bonds versus funds; the other big benefit is simply the education in how the bond market works.

October 15, 2020 Dan

My Custom Blazor Authentication

I have been building out my first Blazor application, and have been figuring out the pattern I want to follow for user authentication. Rather than use the built-in Blazor authentication components, I am using libraries I have built over the years, which I can plug into almost any app without having to think.

Where I got tripped up a bit with Blazor was in communicating the logged-in and logged-out status across components. After lots of Googling and experimenting, here is what I came up with:

In my Blazor MainLayout page, I embedded the login bar component directly. This login bar just shows a login button, or, if the user is logged in, a logout button and the user's name. When the sign-in button is clicked, the login bar launches a login form component to collect the username and password and call the authentication API with the credentials. When a login is successful, the login bar updates a singleton class, ApplicationVars.cs, that holds all the profile information. This singleton allows all pages in the application to access these values. It is a nice, compact, and elegant way to handle what we used to call global and session variables – and is rapidly becoming one of my favorite things about Blazor.
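
A stripped-down sketch of that singleton and its registration is below. The property names are illustrative, and the registration line goes wherever your services are configured (Program.cs or Startup.cs, depending on the flavor of Blazor).

    // Sketch of an ApplicationVars-style class holding the logged-in user's profile values.
    public class ApplicationVars
    {
        public string? UserName { get; set; }
        public string? AuthToken { get; set; }
        public bool IsLoggedIn => !string.IsNullOrEmpty(AuthToken);
    }

    // Registered at startup; any component can then @inject ApplicationVars.
    builder.Services.AddSingleton<ApplicationVars>();

One caveat worth noting: in Blazor WebAssembly a singleton lives in the user's browser session, so it behaves like per-user state, but in Blazor Server a singleton is shared across all users, and a scoped registration is the safer choice for anything holding credentials.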

Where I got tripped up was when I wanted to enforce a logout, or deal with an expired user session and message it appropriately. I finally settled on the solution of adding multiple routes to the same index page. I built routes called '/logout' and '/expired' and added them to the index page:
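
The route piece is just multiple @page directives on the same page, something like:

    @* The index page answers the normal, logout, and expired URLs. *@
    @page "/"
    @page "/logout"
    @page "/expired"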

Where I erred was in putting the logic to destroy the credentials in ApplicationVars inside the index page, which renders after the login bar, so the login bar didn't reflect the credentials change. I futzed around with trying to get the index page to fire an event when the logout is detected, and have the login bar listen for the event and update appropriately. The better solution was to have the login bar sniff the URL, and when it sees the logout or expired URL, destroy the credentials and redirect to the appropriate index URL. It was a much more elegant solution than getting the event wired up. It makes me wonder – maybe a new rule I need to follow: for your average business application, if you have to fire an event, you are probably doing something wrong in your design.
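
One way to do that URL sniffing in the login bar is with the injected NavigationManager; below is a rough sketch using the framework's LocationChanged event and the hypothetical ApplicationVars class from above.

    @using Microsoft.AspNetCore.Components.Routing
    @inject NavigationManager Nav
    @inject ApplicationVars AppVars
    @implements IDisposable

    @code {
        protected override void OnInitialized()
        {
            Nav.LocationChanged += OnLocationChanged;
            CheckForLogoutUrl(Nav.Uri);
        }

        private void OnLocationChanged(object? sender, LocationChangedEventArgs e)
            => CheckForLogoutUrl(e.Location);

        private void CheckForLogoutUrl(string location)
        {
            var uri = location.ToLowerInvariant();
            if (uri.EndsWith("/logout") || uri.EndsWith("/expired"))
            {
                AppVars.UserName = null;          // destroy the credentials
                AppVars.AuthToken = null;
                Nav.NavigateTo("/");              // back to the plain index URL
                InvokeAsync(StateHasChanged);     // refresh the login bar display
            }
        }

        public void Dispose() => Nav.LocationChanged -= OnLocationChanged;
    }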

This design is working, and so far it has nicely isolated the authentication logic from the rest of the application. One worry is that there is no easy way to have the app communicate with the login bar, so I may have isolated it too much. In various recent Blazor blog posts, I have noticed people moving away from the singleton approach in favor of wrapping the application vars around the whole application in the router. This new technique doesn't change the overall concept, so I may adopt that pattern in my next project. But for now, I will proceed with this pattern and see how many times it paints me into a corner I don't want to be in.

September 21, 2020 Dan
