Fish shell quickstart for converting bash scripts

After some years of bash and PowerShell, and some hours of using fish, I’ve realised that expansion & predictive typeahead are good features in a shell, whereas “be a great programming language” is less important than I thought, because there is no need to write scripts in the language of your shell.

Fish has slicker typeahead and expansions than bash or even PowerShell. But to switch to a fish shell, you do still have to convert your profile & start-up scripts. So here’s my quick-start guide for converting bash to fish.

  • Do this first: at the fish prompt type help. Behold! the fish documentation in your browser is much easier to search than man pages are.
  • Calmly accept that fish uses set var value instead of var=value. Roll your eyes if it helps.
  • Use end everywhere that bash has fi, done, esac, closing braces {} etc. For example, a function definition is function ... end. The keywords do and then are redundant everywhere: just remove them. In a one-liner, else takes a semicolon after it. bash’s case statement becomes switch with case clauses, all closed by end.
  • There is no [[ condition ]] but [ ... ] or test ... work. Type help test to see all the file and numeric tests you expect, such as if [ -f filename ] etc. string and regex conditionals are done with the string match command (see below). You can replace [[ -f this && -z that || -z other ]] with [ -f this -a -z that -o -z other ] but see below for how fish can also replace || and && constructions with or and and statements.
  • But first! type help string to see the marvels of proper built-in string commands.
  • Replace function parameters $*, $1, $2 etc with $argv, $argv[1], $argv[2] etc. If that makes you scowl, then type help argparse. See! That’s much better than kludging about in bash.
  • Remove the $ from $(subcommand) leaving just (subcommand). Inside quotes, take the subcommand outside the quote: "Today is $(date)" becomes "Today is "(date). (Recall that quotes in bash & fish don’t work at all like quotes in most programming languages. Quote marks are not token delimiters, and a"bc"d is a valid single token, parsed identically to each of abcd, "abcd", abc'd'.)
  • Replace heredocs with multi-line literal strings and standard piping syntax. However, note that if you pipe or read to a variable, the default multiline behaviour is to split on newline and generate an array. Defeat this by piping through string split0 – see https://fishshell.com/docs/current/index.html#command-substitution. A sketch pulling several of these pieces together follows this list.
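A minimal sketch of the points above in one fish function (the function name and the regex are just for illustration):

function greet
    # $argv replaces $1, $2 ...; (count $argv) replaces $#
    if [ (count $argv) -eq 0 ]
        echo "usage: greet <name>" >&2
        return 1
    end
    set name $argv[1]
    # string match -rq is a quiet regex test
    if string match -rq '^[A-Za-z]+$' $name
        # command substitution is (date), not $(date), and sits outside the quotes
        echo "Hello, $name. Today is "(date)
    else
        echo "greet: that doesn't look like a name" >&2
        return 1
    end
end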

Search-and-replace Script Snippets

Here is my hit-list of things to search and replace to convert a bash shell to fish. These resolved almost all of my issues in converting a few hundred lines of bash script to fish.

From (bash) → To (fish), with notes:

var=value → set var value

export var=value → set -x var value

export -f functionname → redundant; just remove it

alias abbr='commandstring' → no change
    (alias syntax is accepted as an abbreviation for a function definition since fish 3)

command $(subshell command) → command (subshell command)
command `subshell command` → command (subshell command)
    (or command (subshell command | string split0); just remove the $ but keep the (). See below for when you want to add string split0)

command "$(subshell command)" → command (subshell command)
    (remove both the $ and the quotes "" to make this work)

if [[ condition ]] ; then this ; else that ; fi → if [ condition ] ; this ; else ; that ; end
    (see below for more on fish's multiline and and or syntax)

if [[ number != number ]] ; then this ; else that ; fi → if [ number -ne number ] ; this ; else ; that ; end

while condition ; do something ; done → while condition ; something ; end

$* → $argv

$1, $2 → $argv[1], $argv[2]
    (but see help argparse)

if [[ testthis =~ substring ]] → if string match -q '*substring*' testthis
    (string match without -r does glob-style testing)

if [[ testthis =~ regexpattern ]] → if string match -rq regexpattern testthis
    (string match with -r does regex testing)

[ guardcondition ] && command → works as-is
[ guardcondition ] || command → works as-is
    (but see or and and below for when it's more complex)

var=${this:-$that} → if set -q this ; set var $this ; else ; set var $that ; end

cat > outfile <<< "heredoc" → echo "heredoc" | cat > outfile
cat > outfile <<< "multiline … heredoc" → echo "multiline … heredoc" | cat > outfile
    (no heredocs, but multiline strings are fine. NB printf is better than echo for anything complicated, in any shell)

if [[ -z $this && $that =~ $pattern ]] → if [ -z $this ] ; and string match -rq $pattern $that

content=$(curl $url) → set content (curl $url | string split0)
    (without the pipe to string split0, content will be split on newlines into an array of lines)
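As a worked example, here is a typical profile fragment converted using the table above (the paths and the alias are illustrative):

# bash original:
#   export PATH="$HOME/bin:$PATH"
#   alias gs='git status'
#   if [ -f "$HOME/.secrets" ]; then source "$HOME/.secrets"; fi

# fish conversion (PATH is a real list in fish, not a colon-delimited string):
set -x PATH $HOME/bin $PATH
alias gs='git status'
if [ -f "$HOME/.secrets" ]
    source "$HOME/.secrets"
end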

Fish’s multiline and and or syntax

Fish has a multiline and and or syntax that may be clearer than && and || in both conditionals and guarded commands, though it is less terse.

[ condition ]
and do this
or do that

That said, && and || are still valid in commands:

[ condition ] && do this || do that
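For example, a guard that creates a directory only when it is missing (the path is illustrative):

[ -d ~/backups ]
or mkdir ~/backups
and echo "backups directory is ready"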

Other gotchas

  • You may have to read up on how fish does parameter expansion, and especially how it handles spaces, differently from bash (see the snippet after this list).
  • Pipe & subcommand output to multiline strings or arrays: set x (cat myfile.txt) will set x to an array of the lines of myfile.txt. To keep x as a single multiline string, use string split0: set x (cat myfile.txt | string split0)
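A quick illustration of the space-handling difference (the file name is illustrative):

set f "my file.txt"
cat $f
# fish passes a single argument, my file.txt. In bash, an unquoted $f
# would be word-split into two arguments, my and file.txt.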

Official tips for new fishers:

See the FAQ at https://fishshell.com/docs/3.0/faq.html

Use NSSM to install SyncThing as a Windows service

SyncThing does what OneDrive & Google Drive can do but under your control, across your machines, with more options, and without having to touch a 3rd party data snooping provider and without having to pay 3rd party Terabyte rates. I use it on my home network both to synchronise configuration across multiple machines and as an at-home backup solution. It’s fast, simple, well-maintained and it works.

NSSM is “the Non-Sucking Service Manager” which has a simple GUI to set up commandline programs like SyncThing as a Windows Service.

Install SyncThing

To use SyncThing as a Service, avoid the GUI options such as SyncTrayzor and go for the GitHub download. Choose a directory to install to, such as your Program Files directory.

SyncTrayzor is great for your working machine, where you only need SyncThing to run when you are logged in. For a server which is hosting backups and redundant copies of your files, you want a Windows service running whenever the machine is up.

Install NSSM

NSSM also has no installer as of early 2020. Download & extract to a Program Files directory.

I then added New-Alias nssm "C:\Program Files\nssm-2\win64\nssm.exe" to my PowerShell profile.

Launch NSSM

nssm without parameters will show you the commands you can use. The simplest is to use install & edit to get the GUI:
To show service installation GUI: nssm install [<servicename>]
To show service editing GUI: nssm edit <servicename>

So use:

nssm install SyncThing

And then fill in the boxes by finding the path where you installed SyncThing. I only edited the first three tabs: Application, Details, and Log On. The rest can stay as default.

What about the Parameters? See the SyncThing Docs. Mine are:

-no-console -no-browser -no-restart -gui-address=localhost:8384

-no-console -no-browser are because services run headless.
-no-restart because the Windows Service infrastructure has options for handling restarts.
-gui-address=localhost:8384 to make the gui console only available on localhost, not across the network. You may not want this.
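If you would rather script the setup than click through the GUI, nssm set can fill in the same boxes from the command line. A sketch, assuming my install path (yours will differ):

nssm install SyncThing "C:\Program Files\SyncThing\syncthing.exe"
nssm set SyncThing AppDirectory "C:\Program Files\SyncThing"
nssm set SyncThing AppParameters "-no-console -no-browser -no-restart -gui-address=localhost:8384"
nssm set SyncThing Description "SyncThing file synchronisation"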

You can now use nssm to start/stop/monitor services, not just the ones you have installed with it.

nssm start SyncThing
nssm status SyncThing

Or, you can use the standard Windows Services gui.

Where is the config?

Nssm just edits the Windows service config, which is visible in the Local Services app; you can launch that from Task Manager -> Services.

SyncThing keeps its config in the place noted in the SyncThing Docs, unless you add e.g. -home=D:\MyPath to the startup parameters.

Where is the SyncThing Gui?

If you followed my example and used -gui-address=localhost:8384 then open that address in your browser and read all about it at https://docs.syncthing.net/intro/gui.html

More Options?

See https://docs.syncthing.net/users/syncthing.html

Yes but I want to manage it across my home network?

  1. Change the startup options to use -gui-address=0.0.0.0:8384.
  2. Add the full path to SyncThing.exe as a firewall exception in your Windows firewall (a PowerShell sketch follows this list).
  3. Restart the service.
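For step 2, a PowerShell sketch, assuming the install path used above:

New-NetFirewallRule -DisplayName "SyncThing" -Direction Inbound -Program "C:\Program Files\SyncThing\syncthing.exe" -Action Allow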

This will make the browser interface accessible across the network. Then:

  1. Open the GUI at localhost:8384.
  2. Open the Settings (under the Actions menu, top right).
  3. Open the GUI panel.
    1. Choose HTTPS
    2. Add a username and password. NB I think these are both case sensitive.

Bash and PowerShell in a single script file

I’m not saying it’s all dotnet’s fault, but it was when deploying dotnetcore services to a linux VM that I thought, “what I really, really want is both bash and powershell setup scripts in a single file”. Surely a working incantation can be crafted from such arcane systems of quoting and escaping as the two languages offer?

Half an evening later:

# This file has a bash section followed by a powershell section,
# and a shared section at the end.
echo @'
' > /dev/null
#vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
# Bash Start --------------------------------------------------

scriptdir="`dirname "${BASH_SOURCE[0]}"`";
echo BASH. Script is running from $scriptdir

# Bash End ----------------------------------------------------
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
echo > /dev/null <<"out-null" ###
'@ | out-null
#vvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvvv
# Powershell Start --------------------------------------------

$scriptdir=$PSScriptRoot
"powershell. Script is running from $scriptdir"

# Powershell End ----------------------------------------------
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
out-null

echo "Some lines work in both bash and powershell. Calculating scriptdir=$scriptdir, requires separate sections."

It relies on herestring quoting being different for each platform, as is the escape character (\ vs `). Readability (ha!) is very much helped by

#comments begin with a hash 

being common to both, so I can do visible dividers between the sections.

My main goal was environment variable setup before launching dotnetcore services. Sadly the incompatible syntaxes for variables and environment:

#powershell syntax
$variable="value"
$env:variable2=$value
#bash syntax
variable=value
export variable2=value 

mean very little shared code inside the file, but it really cut down errors a lot just by having both in the same file. Almost-a-single-source-of-truth turned out to be much more reliable than not-at-all-a-single-source-of-truth.

Bash-then-powershell was simpler than Powershell-then-bash. My state of the art is powershell named and validated parameters, which allows tab-completion to work in powershell.

` # \
# PowerShell Param
# every line must end in #\ except last line must end in <#\
# And, you can't use backticks in this section        #\
param( [ValidateSet('A','B')]$tabCompletionWorksHere, #\
       [switch]$andHere                               #\
     )                                               <#\
#^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ `

Repo: github.com/chrisfcarroll/PowerShell-Bash-Dual-Script-Templates

Raw: Powershell-or-bash-with-parameters.

Alternatively, do everything in powershell?

Of course, sensible people would do everything in a single scripting language. But it has been well-worth having the tools for both approaches. Especially for short bootstrap scripts.

For a powershell core everywhere approach, my main adaptation is the shebang header on all .ps1 files:

#! /usr/bin/env pwsh

which tells unix machines what kind of script it is. Powershell itself ignores it as a comment. Finally, you must also chmod a+x *.ps1 to mark the scripts as executable.
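A minimal complete example (the file name and content are illustrative):

#! /usr/bin/env pwsh
param([string]$Name = "world")
"Hello, $Name. Running from $PSScriptRoot"

Save it as hello.ps1, chmod a+x hello.ps1, and it runs as ./hello.ps1 on unix and .\hello.ps1 on Windows.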

Migrating Net Framework to Netcore

Until NetCore 2 came along, migrating an existing Net Framework project to dotnet core was likely a painful exercise in futility, as you time-consumingly discovered just how many bits of the .Net framework don’t exist on netcore 1. Small things, like key parts of AdoNet. It was a bleeding edge experience.

But then there was dotnet core 2 with not-very-far-off 100% Api compatibility. And now all is sweetness and light.

Seriously. It is. Huge chunks of your .Net framework project will now ‘just work’ on .netcore, with little or no editing. In fact it’s so easy, you might consider multi-targeting net framework and netcore just to show off.

Console apps and class libraries are straightforward. Considerations for UI and platform technologies:

  • For AspNet, there is a learning curve from Mvc versions 3/4/5 to AspNetCore for which I will refer you to the tutorials. There is then work to do which I do cover below.
  • For Windows Forms and WPF projects, I refer you to the considerations in the MS guide for porting Winforms.
  • WWF and WCF-serverside are not (currently) migratable, although WCF-clientside is. Moving Web-facing WCF-serverside to dotnetcore with NancyFx or to AspNetCore would be a smallish rewrite, about the same as moving to WebApi2; but it seems that MS are working on WCF serverside. Remoting, however, is gone. So if your preference for WCF is that it’s a better style than all this pseudo-resty AspNet nonsense, then consider Nancy.

Overview

  1. Start with a new, empty dotnet core 2 project
  2. Drag-n-drop all your existing code into it, excluding AssemblyInfo.cs
  3. Deal with .settings and .config files
  4. Re-add your NuGet dependencies
  5. Deal with other code differences
  6. Build and Go!

Well okay, that last step, Build-and-Go, is more likely to be Build-and-Fix-The-Next-Compiler-Error-And-Build-Again. But it is mostly straightforward.

To migrate AspNet to AspNetCore there are more steps, and you do have to start with the learning curve for a whole new framework. That said, it’s like someone thought, “let’s redo Mvc as WebApi2 + Razor + Views but with a cleaner startup style and with mandatory dependency injection”. Your controllers will hardly change. I do find AspNetCore simpler, cleaner, easier to work with. Roughly, your steps are:

  1. Work through the getting started tutorials & learning curve. (Estimate 5-10 hours per person?)
  2. Migrate your startup config to the new approach (2-10 hours depending on how much novel startup code you have)
  3. Migrate any custom authentication to the new approach (An hour or so if you read the gotcha below)
  4. Consider whether your Attribute-based filters will remain as attributes or be re-worked into something else.
  5. Re-tool unit tests which mocked the old Asp.Net Mvc dependencies

Larger sets of projects

If you are dealing with not just a single project but a whole load of them, you should first look through Microsoft’s guide to porting. The main reason to not work through those steps for a single project is that since netcore2, the fastest way to analyse “what problems will I have in porting” for a smallish project is to just do it! You can most likely finish the job already, faster than you can use the analysis tools to predict what problems you will have. That wasn’t always the case before netcore 2. A couple of thoughts from that guide that I do recommend though:

Start with a new empty dotnet core 2 project.

To migrate an executable you’d create a console app. For a class library, you can make it a netstandard2 project, which makes the project available for use in .net 4.6.1+ / 4.7 as well as in dotnet core.

The command line is very trendy in dotnet core, so you can do it all with dotnet new instead of using a GUI. dotnet new will show what templates are installed on your machine.
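For example (the project names are placeholders):

dotnet new console -o MyTool.Cli
dotnet new classlib -o MyTool

On the 2.x SDK the classlib template targets netstandard2.0 by default, which gives you the widest reach.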

Drag-n-drop all your existing code in, excluding AssemblyInfo.cs

dotnet core projects assume, by default, that if there’s a code file in the directory or a subdirectory then it’s part of the project, so dragging all your code into the new project directory will just work.

Don’t include the AssemblyInfo.cs, because that gets auto-generated from the .csproj file. If you have anything of interest in your AssemblyInfo.cs, edit the .csproj file and put it in there. The AssemblyInfo properties section of Additions to the csproj properties for dotnet core shows you the element names to use if you want to re-add information. Something like:

<PropertyGroup>
  <TargetFrameworks>netstandard1.6;net40</TargetFrameworks>
  <AssemblyVersion>4.1.4.3</AssemblyVersion>
  <AssemblyFileVersion>4.1.4.3</AssemblyFileVersion>
  <PackageVersion>4.1.4.3</PackageVersion>
  <GenerateDocumentationFile>true</GenerateDocumentationFile>
  <Title>TestBase – Rich, fluent assertions and tools for testing with heavyweight dependencies: AspNetCore, AdoNet, HttpClient, AspNet.Mvc, Streams, Logging</Title>
  <PackageDescription><![CDATA[*TestBase* gives you a flying start with ....etc...]]></PackageDescription>
</PropertyGroup>

Note the new properties with names beginning with <Package...>, which will be picked up by dotnet pack when creating NuGet packages. NuGet is much easier with dotnet core; it’s kind of built-in instead of being an extra thing to learn and do.

Deal with .settings and .config files

There is a whole new approach to settings and configuration. You will have to learn it. It’s good though. It lets you do things like this:

{
  "AComponentDefaults": {
    "SomeSetting": "Me",
    "ANumericSetting": 1.0,
    "Subsetting": {
      "Something": "Sub"
    }
  },
  "JustOneLine": "This"
}

and then read a whole section as strongly-typed settings with a one-liner:

var myComponent = new AComponent();
Configuration.GetSection("AComponentDefaults").Bind(myComponent);

You can still use single-line settings of course:

var justOneLine = Configuration["JustOneLine"];

The new system deals easily with per-environment overrides, and has a whole new “get your config from all kinds of other sources than the settings file” capability.
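A sketch of how the sources stack, using the standard ConfigurationBuilder from Microsoft.Extensions.Configuration (the file names follow the usual convention; later sources override earlier ones):

var config = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json")
    .AddJsonFile("appsettings.Production.json", optional: true)
    .AddEnvironmentVariables()
    .Build();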

Re-add your NuGet dependencies

This is straightforward. In Visual Studio (or in JetBrains Rider) use right-click -> Manage NuGet Packages. On the command line it’s dotnet add package.

The big news here is that most of your NuGet dependencies already work on dotnetcore. All of the most downloaded NuGet packages are either multi-targeted or have packages for each platform. (The dependency trees for most packages on dotnet core are quite different to the dependency trees on net framework, but on the whole it makes no difference at all.)

Deal with other code differences

I don’t think there are too many. Under netcore2, your major external dependencies – AdoNet, HttpClient and FileSystem – are all either the same or quite similar. SqlClient, Npgsql, Dapper are pretty much unchanged and the rest of the Framework is very much the same.
Main code changes:

  • Scan the list of breaking changes, which are largely in low-level or platform specific areas.
  • If you use reflection you must often use type.GetTypeInfo().GetXXX() instead of type.GetXXX(). If you’re good with regex, this just needs a search-&-replace to fix (a sketch follows this list).
  • EntityFramework Core is different, but not extremely different.
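The reflection change, sketched (Thing is a placeholder type; add using System.Reflection; for GetTypeInfo()):

// net framework style:
// var props = typeof(Thing).GetProperties();
// portable style:
var props = typeof(Thing).GetTypeInfo().GetProperties();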

Build and Go!

And … deploy to Macos and Linux. Hurray.

Migrating AspNet to AspNetCore

Work through the getting started tutorials

Really. Don’t try to skip the aspnetcore getting started learning curve. Be aware that the tutorials push the new Razor Pages approach, which you will want to ignore. Instead be sure you’re clear on how the new approach handles startup, dependency injection, attributes, filters, and authentication. Your controllers and routing will largely work with minimal change.

Migrate your startup config to the new approach

So having done your learning curve, you understand that all your Global.asax.cs and App_Startup code will move into, or be called from, your Startup class. You will cleanly separate config setup, having learned about the new configuration approach, and you will use a dependency injection container to provide any global config to your controllers.

Fix-up ControllerContext changes

There are some fiddly tidy-up changes on ControllerContext and Request properties. For instance, UserHostAddress is no more; you must look for HttpContext.Connection.RemoteIpAddress instead. Global HttpContext is gone, but of course you were always careful to use controller.HttpContext, weren’t you?

Add the new interfaces to Attribute-based filters, or else rework them as middleware

You do need to learn about the new kinds of filters, and consider whether what you are doing with your filters should stay as-is in attribute filters, or whether it might be simpler to move the logic into the new middleware approach.

Migrate any custom authentication/authorization to the new approach

The mistake to avoid here is trying to make your custom AuthorizationAttribute work as an AspNetCore attribute. Don’t. Instead,

  • Move the logic of your custom AuthorizationAttribute into a Policy, which could be just a single method call.
  • Delete your custom attribute and let the built-in AuthorizeAttribute reference your new policy:
    [Authorize(Policy="MyCustomPolicy")]

It would have saved me half a day if I’d realised this up-front. But on this plan, you can migrate custom authentication in an hour or even minutes.
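A sketch of the policy registration in Startup.ConfigureServices; the policy name matches the attribute above, and the claim check is a placeholder for your own logic:

services.AddAuthorization(options =>
{
    options.AddPolicy("MyCustomPolicy", policy =>
        policy.RequireAssertion(context =>
            context.User.HasClaim(c => c.Type == "MyCustomClaim")));
});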

Re-tool your unit test controller dependencies for the new framework

There is some popular code on the web for mocking the dependencies of an Mvc 3 or 4 or 5 Controller.ControllerContext. This must all be replaced.

Myself, for Mvc 4 & 5 I always used TestBase-Mvc which gave me two simple extension methods:

var unitUnderTest = new MyMvcController(...)
    .WithHttpContextAndRoutes();

var webApiControllerUnderTest = new MyWebApiController(...)
    .WithWebApiHttpContext<T>(HttpMethod httpMethod,
                              [Optional] string requestUri,
                              [Optional] string routeTemplate);

//Or, optional parameters to process the actual route urls from your RegisterRoutes config:

controllerUnderTest
    .WithHttpContextAndRoutes(
        [Optional] Action<RouteCollection> mvcApplicationRoutesRegistration,
        [Optional] string requestUrl,
        [Optional] string query = "",
        [Optional] string appVirtualPath = "/",
        [Optional] HttpApplication applicationInstance);

This makes sure a controller can reference cookies, session, TempData, the Url.Action() calls and even the global HttpContext.Current in a unit test context.

For AspNetCore, I wrote TestBase.Mvc.AspNetCore (soon to be renamed to TestBase.AspNetCore) which offers a similar thing:

var uut = new ControllerUnderTest().WithControllerContext();
uut.Url.Action("a", "b").ShouldEqual("/b/a");
uut.ControllerContext.ShouldNotBeNull();
uut.HttpContext.ShouldBe(uut.ControllerContext.HttpContext);
uut.Request.ShouldNotBeNull();
uut.ViewData.ShouldNotBeNull();
uut.TempData.ShouldNotBeNull();
uut.MyAction(param)
   .ShouldBeViewResult()
   .ShouldHaveModel<YouSaidViewModel>()
   .YouSaid.ShouldBe(param);

It also has a large set of fluent assertions for ViewResults, FileResults, etc, etc. Once I’d written the new infrastructure, migrating my controller unit tests was mostly painless. (Nb it still needs a few changes for CompatibilityVersion.Version_2_2; it’s currently written for 2.0.)

New in AspNetCore is the ease of testing not just individual controllers but the whole hosted application. The AspNetCore team coded a TestServer for their unit tests, and this server can be used, bootstrapped with your actual application’s Startup code, and then tested with an HttpClient:

[TestFixture]
public class WhenTestingControllersUsingAspNetCoreTestTestServer : HostedMvcTestFixtureBase
{
    [TestCase("/dummy/action?id={id}")]
    public async Task Get_Should_ReturnActionResult(string url)
    {
        var id = Guid.NewGuid();
        var httpClient = GivenClientForRunningServer<Startup>();
        GivenRequestHeaders(httpClient, "CustomHeader", "HeaderValue1");

        var result = await httpClient.GetAsync(url.Formatz(new {id}));

        result
            .ShouldBe_200Ok()
            .Content.ReadAsStringAsync().Result
            .ShouldBe("Content");
    }
}

But I have come round to seeing this as automated integration testing, not unit testing. I would use it for testing that, e.g., content negotiation works as expected, not for testing the domain logic of a controller action.

Conclusion

Since the arrival of netcore2, the cost of migrating to DotNetCore is dramatically lower. DotNetCore tooling and extensibility are very good. Even migrating AspNet is not an excessive task. Even for plain NetFramework 4 development, the new tooling is simpler and better. I reckon that dotnetcore is cheaper and easier to write and maintain. Both C# and the framework are evolving in ways that reduce your cost of development. And you get cross-platform deployment pretty much for free. At last.