A solution to missing XML files and Swagger

You use Swagger and include XML comment files, which are copied during the build from a few different projects to the root of your API application. Then one day the build doesn’t copy one of the XML files, and this line throws an exception:

options.IncludeXmlComments(Path.Combine(AppContext.BaseDirectory, "MyProject.Domain.xml"));

Yes, you can fix the build to resolve the error, but if you treat XML comments as content (not code), you can do this:

foreach (var file in Directory.GetFiles(AppContext.BaseDirectory, "*.xml"))
{
    options.IncludeXmlComments(file);
}

It appears that Swagger doesn’t break when a non-XML-comment file is included, which is convenient. With this approach, only XML files that actually exist are included, so you don’t risk breaking your app when one is missing.
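
If you would rather not rely on that behavior, here is a slightly stricter variant (just a sketch, assuming each comment file sits next to its assembly and shares its name) that only includes XML files with a sibling .dll:

foreach (var file in Directory.GetFiles(AppContext.BaseDirectory, "*.xml"))
{
    // Only include XML files that have a matching assembly next to them,
    // which skips unrelated XML files (config files, for example)
    if (File.Exists(Path.ChangeExtension(file, ".dll")))
    {
        options.IncludeXmlComments(file);
    }
}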

Reading appSettings from Web.config into Classic ASP

I’ve done far too much work in my career on Classic ASP. Many times, sites originally written in Classic ASP were ported to ASP.NET, and often the transition between the two happened over time, with both technologies living together in the same web site. This poses many challenges, but one is easy to resolve: global application settings.

In ASP.NET, it is common to use the appSettings section of the Web.config file to store global application values. In Classic ASP, it is common to store these values in the Application() object, and to set them in the global.asa file.

If we want to share these values so they do not exist in both places, we can create Application object values based on appSettings in the Web.config file.

First, let’s imagine this is our global.asa file:

Sub Application_OnStart
	Application("value1") = "hello"
	Application("value2") = "goodbye"
End Sub

And let’s imagine this is the appSettings section of our Web.config file:

<appSettings>
	<add key="value1" value="hello" />
	<add key="value2" value="goodbye" />
</appSettings>

Same values, but they exist in two places. Fortunately, the Web.config file is an XML file, and Classic ASP can read XML files. Replace your global.asa code with this:

Sub Application_OnStart
	Dim xmlDoc, xmlAppSettings, addItem

	' Load the Web.config file as an XML document
	Set xmlDoc = Server.CreateObject("Microsoft.XMLDOM")
	xmlDoc.async = False
	xmlDoc.load Server.MapPath("Web.config")

	' Copy each add element under appSettings into an
	' Application() value with the same key
	Set xmlAppSettings = xmlDoc.getElementsByTagName("appSettings").Item(0)
	For Each addItem In xmlAppSettings.getElementsByTagName("add")
		Application(addItem.getAttribute("key")) = addItem.getAttribute("value")
	Next

	Set xmlAppSettings = Nothing
	Set xmlDoc = Nothing
End Sub

This code reads your Web.config and creates corresponding Application() items with the same keys and values. Going forward, you can keep your configuration entries in the Web.config file (and use config file transformations).

Keep in mind that the global.asa’s Application_OnStart only runs when the application is first started, so changes to the Web.config may not get picked up until you restart your web application.

Classic ASP may have its limitations, but creative solutions like this can solve some of them.

Ask yourself six questions before building the Next Big Thing

You have a great new idea for a great new feature or function or service. Before rushing off to create it or implement it, ask yourself…

Who is this feature/function for?
What are the other ways to solve this problem?
Where is the right place for this feature/function?
When is this important to do?
Why is this important?
How will this be supported today, tomorrow, and forever?

We often build things without thinking them through. Be your own worst enemy before tackling something, so you know your future self will agree that you made the right decision (or, at least, the best decision you could make at the time).

Extending DbSet for easy filter expressions

Databases often have entities that are filtered by a repeated expression. A common one is filtering by a UserId. In SQL, this looks like:

select * 
from orders
where userid = @userid;

Using Entity Framework, we may write something like this:

dataContext.Orders.Where(x => x.UserId == userId);

But I’d really like to make it more expressive and consistent, like this:

dataContext.Orders.ForUser(userId);

Fortunately, it is possible, with an interface and an extension method.

The interface, which I will call IUserEntity, exposes the properties that the common filter needs.

public interface IUserEntity
{
	int UserId { get; set; }
	User User { get; set; }
}

Any class that can be filtered by user should implement this interface.

public class Order : IUserEntity
{
	public int Id { get; set; }
	public int UserId { get; set; }
	public User User { get; set; } = null!;
}

Then an extension method extends any DbSet whose entity type implements IUserEntity, simply returning a filtered query.

public static class DbSetExtensions
{
	public static IQueryable<T> ForUser<T>(this DbSet<T> userEntities, int? userId) where T : class, IUserEntity
	{
		if (userId.HasValue)
		{
			return userEntities.Where(x => x.UserId == userId.Value);
		}
		return userEntities;
	}
}

Note how in the above I made the userId nullable — this is not required, but it does give you a bit more flexibility.
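
For example (hypothetical usage), passing null simply skips the filter:

var allOrders = dataContext.Orders.ForUser(null);       // no filter; all orders
var userOrders = dataContext.Orders.ForUser(userId);    // only this user's orders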

This can be replicated for any other common filter, which will make your code more expressive and easy to read. If you wanted similar extensions on other types, such as lists, you could copy the extension method and adjust the parameter and return types, as in the sketch below.
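
Here is what that might look like for any IEnumerable<T> (just a sketch mirroring the DbSet version above, using the same IUserEntity interface):

public static class EnumerableExtensions
{
	public static IEnumerable<T> ForUser<T>(this IEnumerable<T> userEntities, int? userId) where T : IUserEntity
	{
		// Same pattern as the DbSet version: filter only when a userId is supplied
		if (userId.HasValue)
		{
			return userEntities.Where(x => x.UserId == userId.Value);
		}
		return userEntities;
	}
}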

Use compiler directives to hide test code

Instead of this:

[HttpGet]
public string Test()
{
    return "Hello";
}

Try doing this:

#if DEBUG
[HttpGet]
public string Test()
{
    return "Hello";
}
#endif

Advantage: the code is only compiled into Debug builds (not Release builds). Of course, if you’re not paying attention to build types, it won’t help. But it has a shot at keeping your test code out of production.
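
If Debug is too broad for your needs, the same trick works with a custom symbol (a sketch; INCLUDE_TEST_ENDPOINTS is a hypothetical symbol you would define yourself, for example through the DefineConstants property in your project file):

#if INCLUDE_TEST_ENDPOINTS
[HttpGet]
public string Test()
{
    return "Hello";
}
#endif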

The Internet: friend or foe?

I stumbled across an interesting article about how your digital life will follow you whether you want it to or not. (Read: I Called Off My Wedding. The Internet Will Never Forget).

My rule has always been: be mindful of what you post; assume that everyone will read it and that it will exist forever. The Internet will hold on to things that you may want to forget, or let go of, or wish never happened, making the process of moving on or moving past something much harder than it was at any other point in human existence.

In the “good ol’ days” you would tell your friends and family, “I don’t want to speak of this any more,” and that would (hopefully) be the end of it. People may gossip behind your back (how humans love to gossip), but it was, for the most part, over. The memories lived in your head, and you were able to remove (most of) the reminders in the physical world around you. Not so any more.

The Internet is the friend who hears everything you say to it, shares it with everyone it knows (and it knows billions of people), and never forgets any of it. It is an amazing friend, great for so many things, useful in so many ways, but with a potential dark side.

PING a sequence of IP addresses from the Command Prompt

Have you ever needed to find a free IP address in a given range? Or, have you ever needed to find out which IP addresses in a range are in use? Sure, you can type a bunch of PING commands and note the results. Or, you can loop it:

for /l %i in (96,1,111) do @ping -n 1 10.1.0.%i | find "bytes=" 

Simply replace the “96” and “111” with the start/end of your range, and “10.1.0” with the subnet of your choosing. The “1” in the range is the step, so if you want to skip values, simply change this number. The output will only list those devices which reply to the PING.

If you have multiple subnets, you can loop the loop, like this (the outer range of 0 through 3 below is just an example; adjust it to cover your subnets):
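
for /l %j in (0,1,3) do @for /l %i in (96,1,111) do @ping -n 1 10.1.%j.%i | find "bytes="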

Or, if you want it all in a fancy batch file, paste the following into a batch file (I call it “pingall.bat”).

@echo off
if "%3" EQU ""  (
    echo pingall.bat ^<networkip^> ^<startip^> ^<endip^> [step]
    goto :eof
)
setlocal
set networkip=%1
set startip=%2
set endip=%3
set step=%4
if "%step%" EQU "" set step=1
for /l %%i in (%startip%,%step%,%endip%) do ping -n 1 %networkip%.%%i | find "bytes="
endlocal

To run this, you specify the network portion (the first three octets, so class C only in this example), the start IP, the end IP, and optionally the step (which defaults to 1). Your execution will look something like this:

C:\utils>pingall.bat 10.1.2 211 215
Reply from 10.1.2.214: bytes=32 time<1ms TTL=128
Reply from 10.1.2.215: bytes=32 time=2ms TTL=64

It’s a quick way to see what’s online in an IP range — and sometimes, that’s exactly what you are looking for.

Limitations of technology and resources are not “inherently cruel”

In her commentary on speech recognition software (“Speech Recognition Tech Is Yet Another Example of Bias”, Scientific American, Oct 2020), author Claudia Lopez-Llorenda derides the limits of the technology because she had to alter her speech to a non-accented version of her own voice in order to be recognized. In her words, “[changing] such an integral part of an identity to be able to be recognized is inherently cruel.”

The technology community is a scientific one, subject to the same limitations as other sciences: resources, knowledge, and capabilities. It is also subject to the limitations of economics: funding, and supply and demand. To imply that technology companies ignore smaller demographic groups for reasons other than those limitations is short-sighted, and ignores the complexity of the problem and the allocation of available resources to solve it.

Speech recognition has become mainstream, and we have seen solutions delivered to market in the past ten years that were likely considered science fiction 20 years ago. It is still far from a complete solution, as shown by the continued rapid advances and developments in the field. Apple, Google, Amazon, and others have brought speech recognition to dozens of languages in just a few years, delivering an imperfect solution to a complex problem that consumers today expect to work without fail, much like we expect our cars to work when we turn the key.

The difference, of course, is that cars are all the same; people are all different. Even though many of us speak similarly, many of us do not, as we use dialects of the same root language — and that is where the economics of science come into play. When you want to implement language support, you start with the baseline language, to capture the widest population of potential users. As technology improves, the ability to pick up smaller and smaller populations of users who speak in dialects comes with it. These are not limitations built into a solution; rather, they are limitations based on technological capabilities and the available resources to implement them.

When companies decide to do this, it is not to be exclusive; rather, it is to be inclusive of as many people as possible. Turn on the television in the US and watch the news, and you will largely see newscasters speaking in a standard form of American English. This is by design; they are speaking in a way that people with nearly any dialect or understanding of spoken English can follow, thereby being inclusive of the greatest number of people by resorting to a common baseline. Technology companies do the same. This decision does not take away anyone’s cultural identity, nor should it be seen as “inherently cruel”.

Over the next ten years, Ms. Lopez-Llorenda will undoubtedly see incredible advances in speech recognition. As technological capabilities grow, and as company resources are freed up by completion of tasks for larger populations of users, she will see improvements in language support (including dialects) and eventually will see that speech recognition is able to recognize an individual’s speech (for example, picking up the l in salmon, for the one person I met in my life who pronounced it that way). The inherent cruelty will go away, not because anyone felt it was cruel, but simply because technology has caught up.

In the meantime, I will continue, at times, to mask my New York dialect in conversations — not because I am trying to hide my cultural identity. Rather, perhaps I am trying to be inclusive of the listener, who may be unfamiliar with such a dialect; or perhaps I don’t want to come across as a paisano — because, after all, if you heard me ordering a cup of cawfee (milk, no sugah), you would probably quickly form an opinion of who I am. That opinion may be right or wrong, and you are entitled to it, and I don’t take it as an insult. You’re merely taking in speech, and making a decision based on the limited amount of data and information you have to process it — which is, ironically, the same thing Siri is doing.