.Net and Asp.Net on macOS & Mono instead of Windows (using Visual Studio or JetBrains Rider)

Mono goes a long way in running code written for .Net on Windows. It is all much easier if you either start with cross-platform in mind, or if you move to .Net Core; but even for existing .Net Framework projects Mono can run most things, including Asp.Net.

Here's my checklist from a couple of years of opening .Net Framework solution files on a Mac and finding they don't build first time.

Most of these require you to edit the .csproj file to make it cross-platform, so a basic grasp of msbuild is very helpful.
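Almost all of the fixes below rely on MSBuild's Condition attribute together with the built-in $(OS) property, which evaluates to 'Unix' under Mono on macOS or Linux and to 'Windows_NT' on Windows. A minimal illustration of the pattern (the property name ExampleProperty is invented):

<PropertyGroup>
  <!-- $(OS) is 'Unix' under Mono on macOS/Linux, 'Windows_NT' on Windows -->
  <ExampleProperty Condition=" '$(OS)' == 'Unix' ">value used on Mac and Linux</ExampleProperty>
  <ExampleProperty Condition=" '$(OS)' != 'Unix' ">value used on Windows</ExampleProperty>
</PropertyGroup>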

  1. For AspNet: inside the PropertyGroup section near the top of the csproj file, add an element:

    <WebProjectOutputDir Condition="$(WebProjectOutputDir) == '' AND $(OS) == 'Unix' ">bin/</WebProjectOutputDir>

    Use this if you get a 'The “KillProcess” task was not given a value for the required parameter “ImagePath” (MSB4044)' error message; or if the build output shows you are trying to create files in a top-level absolute /bin/ path.

  2. For AspNet: Add Condition="'$(OS)' != 'Unix'" to the reference to Microsoft.Web.Infrastructure.dll AND delete the file from the website bin directory.

    <Reference Condition="'$(OS)' != 'Unix'" Include="Microsoft.Web.Infrastructure, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL">
    <Private>True</Private>
    <HintPath>..\packages\Microsoft.Web.Infrastructure.1.0.0.0\lib\net40\Microsoft.Web.Infrastructure.dll</HintPath>
    </Reference>
  3. For all project types, but only if you need to use the .Net Core dotnet build tooling to build a .Net Framework project on unix; mono's msbuild does not need this. Add this section somewhere in the csproj file (I put it right at the bottom) to resolve .Net Framework 4 reference paths:

    <PropertyGroup Condition="$(TargetFramework.StartsWith('net4')) and '$(OS)' == 'Unix'">
    <!-- When compiling .NET SDK 2.0 projects targeting .NET 4.x on Mono using 'dotnet build' you -->
    <!-- have to teach MSBuild where the Mono copy of the reference assemblies is -->
    <!-- Look in the standard install locations -->
    <BaseFrameworkPathOverrideForMono Condition="'$(BaseFrameworkPathOverrideForMono)' == '' AND EXISTS('/Library/Frameworks/Mono.framework/Versions/Current/lib/mono')">/Library/Frameworks/Mono.framework/Versions/Current/lib/mono</BaseFrameworkPathOverrideForMono>
    <BaseFrameworkPathOverrideForMono Condition="'$(BaseFrameworkPathOverrideForMono)' == '' AND EXISTS('/usr/lib/mono')">/usr/lib/mono</BaseFrameworkPathOverrideForMono>
    <BaseFrameworkPathOverrideForMono Condition="'$(BaseFrameworkPathOverrideForMono)' == '' AND EXISTS('/usr/local/lib/mono')">/usr/local/lib/mono</BaseFrameworkPathOverrideForMono>
    <!-- If we found Mono reference assemblies, then use them -->
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net40'">$(BaseFrameworkPathOverrideForMono)/4.0-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net45'">$(BaseFrameworkPathOverrideForMono)/4.5-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net451'">$(BaseFrameworkPathOverrideForMono)/4.5.1-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net452'">$(BaseFrameworkPathOverrideForMono)/4.5.2-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net46'">$(BaseFrameworkPathOverrideForMono)/4.6-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net461'">$(BaseFrameworkPathOverrideForMono)/4.6.1-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net462'">$(BaseFrameworkPathOverrideForMono)/4.6.2-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net47'">$(BaseFrameworkPathOverrideForMono)/4.7-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net471'">$(BaseFrameworkPathOverrideForMono)/4.7.1-api</FrameworkPathOverride>
    <FrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != '' AND '$(TargetFramework)' == 'net472'">$(BaseFrameworkPathOverrideForMono)/4.7.2-api</FrameworkPathOverride>
    <EnableFrameworkPathOverride Condition="'$(BaseFrameworkPathOverrideForMono)' != ''">true</EnableFrameworkPathOverride>
    <!-- Add the Facades directory.  Not sure how else to do this. Necessary at least for .NET 4.5 -->
    <AssemblySearchPaths Condition="'$(BaseFrameworkPathOverrideForMono)' != ''">$(FrameworkPathOverride)/Facades;$(AssemblySearchPaths)</AssemblySearchPaths>
    </PropertyGroup>
  4. For projects that have lived through the C# evolution from C# 5 to C# 7: You may need to remove duplicate references to e.g. System.ValueTuple. Add Condition="'$(OS)' != 'Unix'" to the reference. This applies to types that Microsoft shipped on NuGet.org during that evolution. I don't know why msbuild builds without complaint on Windows but not on unices.
    Example:

    <Reference Condition="'$(OS)' != 'Unix'" Include="System.ValueTuple, Version=4.0.1.1, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
    <HintPath>..\packages\System.ValueTuple.4.3.1\lib\netstandard1.0\System.ValueTuple.dll</HintPath>
    </Reference>
  5. For references to Microsoft.VisualStudio.TestTools.UnitTesting: Add a nuget reference to MSTest V2 from nuget.org and make it conditional on the OS:

    <ItemGroup Condition="'$(OS)' == 'Unix'">
    <Reference Include="MSTest.TestFramework" Version="2.1.1">
    <HintPath>..\packages\MSTest.TestFramework.2.1.2\lib\net45\Microsoft.VisualStudio.TestPlatform.TestFramework.dll</HintPath>
    </Reference>
    <Reference Include="coverlet.collector" Version="1.3.0" >
    <HintPath>..\packages\coverlet.collector.1.3.0\build\netstandard1.0\coverlet.collector.dll</HintPath>
    </Reference>
    </ItemGroup>

    Note this will only get you to a successful build. To run the tests on unix you then have to download and build https://github.com/microsoft/vstest and run it with e.g.
    mono ~/Source/Repos/vstest/artifacts/Debug/net451/ubuntu.18.04-x64/vstest.console.exe --TestAdapterPath:~/Source/Repos/vstest/test/Microsoft.TestPlatform.Common.UnitTests/bin/Debug/net451/ MyTestUnitTestProjectName.dll.

  6. Case Sensitivity & mis-cased references
    Windows programmers are used to a case-insensitive filesystem, so if code or config contains references to files, you may need to correct mismatched casing. Usually a 'FileNotFoundException' will tell you if you have this problem; see the made-up example below.
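    A hypothetical illustration: if the file on disk is Content/Site.css, then a line like the following runs happily on Windows but throws FileNotFoundException on a case-sensitive filesystem (e.g. Linux, or a case-sensitive APFS volume):

    // Hypothetical example: the file on disk is actually Content/Site.css
    var css = System.IO.File.ReadAllText("content/site.css");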

  7. The Registry, and other Permissions
    See this post for more: https://www.cafe-encounter.net/p1510/asp-net-mvc4-net-framework-version-4-5-c-razor-template-for-mono-on-mac-and-linux

Using the command line

It is helpful to be somewhat familiar with the Microsoft docs on MSBuild Concepts, since msbuild will be issuing most of your build errors. If you have installed mono then you can run msbuild from the command line with extra diagnostics, e.g.

msbuild -v:d >> build.log

you can also run web applications from the command line just by running

xsp

from the project directory.
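xsp serves the current directory on port 8080 by default; if I recall the flags correctly, you can override both the port and the root, e.g.:

xsp4 --port 9000 --root .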

Original Text from 2011

For reasons best not examined too closely I switch between Mac and PC which, since I earn my crust largely with .Net development, means switching between Visual Studio and MS.Net on the one hand, and MonoDevelop and Mono on the other.

Mono is very impressive: it is not at all a half-hearted effort, and it does some stuff that MS haven't done. But when switching environments, there's always the occasional gotcha. Here are some that have got me, and some solutions.

  • Gotcha: Linq Expressions don't work on Mono?
    Solution: Add a reference to System.Core to your project (see the snippet just after this list).
  • Question: What version of NUnit is built in to mono? As of Feb 2011, it's NUnit 2.4.8.
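For the first gotcha, the System.Core reference is just the ordinary csproj element, e.g.:

<Reference Include="System.Core" />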

fish shell quickstart for converting bash scripts

After some years of bash and PowerShell, and some hours of using fish, I've realised that expansion & predictive typeahead are good features in a shell, whereas “be a great programming language” is less important than I thought, because there is no need to write scripts in the language of your shell.

Fish has slicker typeahead and expansions than bash or even PowerShell. But to switch to a fish shell, you do still have to convert your profile & start-up scripts. So here's my quick-start guide for converting bash to fish.

  • Do this first: at the fish prompt type help. Behold! the fish documentation in your browser is much easier to search than man pages are.
  • Calmly accept that fish uses set var value instead of var=value. Roll your eyes if it helps.
  • Use end everywhere that bash has fi, done, esac, braces {} etc. e.g. function definition is done with function ... end (there is a short worked example after this list). The keywords do and then are redundant everywhere; just remove them. else has a semicolon after it. case requires a leading switch(expr).
  • There is no [[ condition ]] but [ ... ] or test ... work. Type help test to see all the file and numeric tests you expect, such as if [ -f filename ] etc. String and regex conditionals are done with the string match command (see below). You can replace [[ -f this && -z that || -z other ]] with [ -f this -a -z that -o -z other ], but see below for how fish can also replace || and && constructions with or and and statements.
  • But first! type help string to see the marvels of proper built-in string commands.
  • Replace function parameters $*, $1, $2 etc with $argv, $argv[1], $argv[2] etc. If that makes you scowl, then type help argparse. See! That's much better than kludging about in bash.
  • Remove the $ from $(subcommand) leaving just (subcommand). Inside quotes, take the subcommand outside the quote: "Today is $(date)" becomes "Today is "(date). (Recall that quotes in bash & fish don't work at all like quotes in most programming languages. Quote marks are not token delimiters and a"bc"d is a valid single token and is parsed identically to each of abcd , "abcd", abc'd').
  • Replace heredocs with multi-line literal strings and standard piping syntax. However, note that if you pipe or read to a variable, the default multiline behaviour is to split on newline and generate an array. Defeat this by piping through string split0 – see https://fishshell.com/docs/current/index.html#command-substitution
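As a short worked example of the points above, here is a trivial bash function and a fish translation of it (the function name and paths are invented for illustration):

# bash
backup() {
  local dest=${1:-$HOME/backups}
  mkdir -p "$dest"
  echo "Backing up to $dest"
}

# fish
function backup
    set dest $argv[1]
    if [ -z "$dest" ] ; set dest $HOME/backups ; end
    mkdir -p $dest
    echo "Backing up to $dest"
end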

Search-and-replace Script Snippets

Here is my hit-list of things to search and replace to convert a bash shell to fish. These resolved almost all of my issues in converting a few hundred lines of bash script to fish.

From (bash) → To (fish), with notes:

var=value → set var value
export var=value → set -x var value
export -f functionname → redundant in fish; just remove it
alias abbr='commandstring' → (no change). alias syntax is accepted as an abbreviation for a function definition since fish 3.
command $(subshell command) or command `subshell command` → command (subshell command), or command (subshell command | string split0). Just remove the $ but keep the (); see below for when you want to add string split0.
command "$(subshell command)" → command (subshell command). Remove both the $ and the quotes "" to make this work.
if [[ condition ]] ; then this ; else that ; fi → if [ condition ] ; this ; else ; that ; end. See below for more on fish's multiline and and or syntax.
if [[ number != number ]] ; then this ; else that ; fi → if [ number -ne number ] ; this ; else ; that ; end
while condition ; do something ; done → while condition ; something ; end
$* → $argv
$1, $2 → $argv[1], $argv[2]. But see help argparse.
if [[ testthis =~ substring ]] → if string match -q '*substring*' testthis. string match without -r does glob-style testing.
if [[ testthis =~ regexpattern ]] → if string match -rq regexpattern testthis. string match with -r does regex testing.
[ guardcondition ] && command and [ guardcondition ] || command → work as-is. But see or and and below for when it's more complex.
var=${this:-$that} → if set -q this ; set var $this ; else ; set var $that ; end
cat > outfile <<< "heredoc" and cat > outfile <<< "multiline … heredoc" → echo "multiline … heredoc" | cat > outfile. There are no heredocs, but multiline strings are fine. NB printf is better than echo for anything complicated, in any shell.
if [[ -z $this && $that =~ $pattern ]] → if [ -z $this ] ; and string match -rq $pattern $that
content=$(curl $url) → set content (curl $url | string split0). Without the pipe to string split0, content will be split on newlines into an array of lines.

Fish's multiline and and or syntax

Fish has a multiline and and or syntax that may be clearer than && and || in both conditionals and guarded commands. It is less terse.

[ condition ]
and do this
or do that

That said, && and || are still valid in commands:

[ condition ] && do this || do that

Other gotchas

  • You may have to read up on how fish does parameter expansion, and especially handling of spaces, differently to bash; there is a small example after this list.
  • Pipe & subcommand output to multiline strings or arrays: set x (cat myfile.txt) will set x to an array of the lines of myfile.txt. To keep x as a single multiline string, use string split0: set x (cat myfile.txt | string split0)
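For instance (filename invented), fish passes an unquoted variable as a single argument where bash would word-split it:

set f "file with spaces.txt"
cat $f    # one argument in fish; in bash an unquoted $f would be split into three words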

Official tips for new fishers:

See the FAQ at https://fishshell.com/docs/3.0/faq.html

Use NSSM to install SyncThing as a Windows service

SyncThing does what OneDrive & Google Drive can do, but under your control, across your machines, with more options, without having to touch a 3rd-party data-snooping provider, and without having to pay 3rd-party terabyte rates. I use it on my home network both to synchronise configuration across multiple machines and as an at-home backup solution. It's fast, simple, well-maintained and it works.

NSSM is “the Non-Sucking Service Manager” which has a simple GUI to set up commandline programs like SyncThing as a Windows Service.

Install SyncThing

To use SyncThing as a Service, avoid the GUI options such as SyncTrayzor and go for the GitHub download. Choose a directory to install to, such as your Program Files directory.

SyncTrayzor is great for your working machine, where you only need SyncThing to run when you are logged in. For a server which is hosting backups and redundant copies of your files, you want a Windows service running whenever the machine is up.

Install NSSM

NSSM also has no installer as of early 2020. Download & extract to a Program Files directory.

I then added New-Alias nssm "C:\Program Files\nssm-2\win64\nssm.exe" to my PowerShell profile.

Launch NSSM

nssm without parameters will show you the commands you can use. The simplest is to use install & edit to get the GUI:
To show service installation GUI: nssm install [<servicename>]
To show service editing GUI: nssm edit <servicename>

So use:

nssm install SyncThing

And then fill in the boxes by finding the path where you installed SyncThing. I only edited the first three tabs: Application, Details, and Log On. The rest can stay as default.

What about the Parameters? See the SyncThing Docs. This is mine:

-no-console -no-browser -no-restart -gui-address=localhost:8384

-no-console -no-browser are because services run headless.
-no-restart because the Windows Service infrastructure has options for handling restarts.
-gui-address=localhost:8384 to make the gui console only available on localhost, not across the network. You may not want this.

You can now use nssm to start/stop/monitor services, not just the ones you have installed with it.

nssm start SyncThing
nssm status SyncThing

Or, you can use the standard Windows Services gui.
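If you prefer to script the setup rather than click through the GUI, nssm can set the same fields from the command line. A sketch, assuming you unzipped SyncThing to C:\Program Files\SyncThing (adjust the paths to your own install):

nssm install SyncThing "C:\Program Files\SyncThing\syncthing.exe"
nssm set SyncThing AppDirectory "C:\Program Files\SyncThing"
nssm set SyncThing AppParameters "-no-console -no-browser -no-restart -gui-address=localhost:8384"
nssm set SyncThing Description "SyncThing file synchronisation"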

Where is the config?

Nssm just edits the Windows service config, which you can see in the Local Services app (launch it from Task Manager -> Services).

SyncThing keeps its config in the place noted in the SyncThing Docs, unless you add e.g. -home=D:\MyPath to the startup parameters.

Where is the SyncThing Gui?

If you followed my example and used -gui-address=localhost:8384 then open that address in your browser and read all about it at https://docs.syncthing.net/intro/gui.html

More Options?

See https://docs.syncthing.net/users/syncthing.html

Yes but I want to manage it across my home network?

  1. Change the startup options to use -gui-address=0.0.0.0:8384.
  2. Add the full path to SyncThing.exe as a firewall exception in your Windows firewall.
  3. Restart the service

This will make the browser interface accessible across the network. Then:

  1. Open the GUI at localhost:8384.
  2. Open the Settings (under the Actions menu, top right).
  3. Open the GUI panel.
    1. Choose HTTPS
    2. Add a username and password. NB I think these are both case sensitive.

Bash and PowerShell in a single script file

I'm not saying it's all dotnet’s fault, but it was when deploying dotnetcore services to a linux VM that I thought, “what I really, really want is both bash and powershell setup scripts in a single file”. Surely a working incantation can be crafted from such arcane systems of quoting and escaping as the two languages offer?

½ an evening later:

# This file has a bash section followed by a powershell section,
# and a shared section at the end.
echo @'
' > /dev/null
#vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
# Bash Start --------------------------------------------------

scriptdir="`dirname "${BASH_SOURCE[0]}"`";
echo BASH. Script is running from $scriptdir

# Bash End ----------------------------------------------------
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
echo > /dev/null <<"out-null" ###
'@ | out-null
#vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
# Powershell Start --------------------------------------------

$scriptdir=$PSScriptRoot
"powershell. Script is running from $scriptdir"

# Powershell End ----------------------------------------------
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
out-null

echo "Some lines work in both bash and powershell. Calculating scriptdir=$scriptdir, requires separate sections."

It relies on herestring quoting being different for each platform, as is the escape character ( \ vs ` ). Readability (ha!) is very much helped by

#comments begin with a hash 

being common to both, so I can do visible dividers between the sections.

My main goal was environment variable setup before launching dotnetcore services. Sadly, the incompatible syntaxes for variables and environment variables:

#powershell syntax
$variable="value"
$env:variable2=$value
#bash syntax
variable=value
export variable2=value 

mean very little shared code inside the file; but it really cut down errors a lot just by having them in the same file. Almost-a-single-source-of-truth turned out to be much more reliable than not-at-all-a-single-source-of-truth.

Bash-then-powershell was simpler than Powershell-then-bash. My state of the art is powershell named and validated parameters, which allow tab-completion to work in powershell.

` # \
# PowerShell Param
# every line must end in #\ except last line must end in <#\
# And, you can't use backticks in this section        #\
param( [ValidateSet('A','B')]$tabCompletionWorksHere, #\
       [switch]$andHere                               #\
     )                                               <#\
#^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ `
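With that header, calling the script from pwsh gets you parameter validation and tab-completion; for example (the script name here is hypothetical):

./deploy.ps1 -tabCompletionWorksHere A -andHere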

Repo: github.com/chrisfcarroll/PowerShell-Bash-Dual-Script-Templates

Raw: Powershell-or-bash-with-parameters.

Alternatively, do everything in powershell?

Of course, sensible people would do everything in a single scripting language. But it has been well-worth having the tools for both approaches. Especially for short bootstrap scripts.

For a powershell core everywhere approach, my main adaptation is the shebang header on all .ps1 files:

#! /usr/bin/env pwsh

which tells unix machines what kind of script it is. Powershell itself ignores it as a comment. Finally, you must also chmod a+x *.ps1 to mark them as executable.