IceFire Studios

IceFire Blog

Tips on using SharePoint in a multilingual environment

Simple Trick to Translate the SharePoint Hub Name

SharePoint's hub navigation supports localization: if you change your language, you can see the navigation links in that language... eventually.  The navigation links are cached, so it may take a while for people to see the translated versions.

That applies to the text of the links, because they support MUI (the Multilingual User Interface), which allows the text of a user interface element to be different for different languages.  It does not work for the "Hub name" that appears in the global navigation menu; it is the same in all languages.

As you can see above, the navigation is localized, but the name of the hub, "Global Navigation", is not.

Fortunately, there is a simple workaround for this.  The first step is to hide the Hub name.  Go to settings (gear icon), then Hub site settings, and select "Hidden in navigation".


The hub name will disappear from the navigation bar.  Next, we will add it back as a label.  Edit the navigation menu, create a new item, make this item a Label, and give it the same name as the Hub name.  For the purpose of demonstration, I used lower case for the word "navigation" so that I can tell it is the label and not the original name.


Make sure it is the first item in the menu and save.

This label does support MUI.  You can localize it.  I used PointFire 365 to do this, but you can also do it manually.



The end result looks exactly like the hub menu, showing the hub name, except that the Hub name is localized.

Inuktitut machine translation is here. Why that's a big deal.

Azure Translator Text now supports Inuktitut, the language spoken in the Inuit regions of the far north of North America.

Over the years, I've received a lot of requests to provide machine translation for Inuktitut.  Despite the tools that are available, particularly from Microsoft, to train your own neural translation engine for an unsupported language using a corpus of translated documents, and despite a great bilingual corpus from the debates of the Nunavut legislative assembly, I knew that this would not be possible.  Other machine translation experts also agreed that it was beyond the state of the art.  On a couple of occasions I applied, unsuccessfully, for funding to push beyond the state of the art and make this possible.  Why is machine translation of Inuktitut so difficult?


Inuktitut belongs to a class called "polysynthetic languages".  Most of the languages that you know are probably "agglutinative": there are root words that can be modified by changing the beginning or the end of the word.  The root word is in the dictionary, but the words formed by adding or changing suffixes and prefixes typically are not, because everyone knows the rules.  These agglutinative languages are part of a larger class called synthetic languages, which also includes languages with other simple rules for sticking words together, usually a small set of rules that apply to one part of speech.  For example, German can stick a lot of known nouns end-to-end to make a new word, but there is one root word and all the other words are modifying or narrowing down its sense, and the resulting word behaves like a longer version of the base word, with the same part of speech.

Inuktitut is polysynthetic.  The combination rules are much more complex.  There can be several root concepts and root words, and a word can lose or change its part of speech, because the full word is an entire sentence with subject, object, verb, adjective, even subordinate clauses, all contained in one big compound word.  How words join together can vary according to complex rules about what comes before and after the join.  A well-known example is the word "ᖃᖓᑕᓲᒃᑯᕕᒻᒨᕆᐊᖃᓛᖅᑐᖓ", which means "I'll have to go to the airport".  Verbs, nouns, subject, object, they're all contained in the same word.

Most, though not all, Native American languages are polysynthetic.  Unlike with other languages, a neural network can't just have a dictionary and some rules and train the translation engine to see patterns of three or four words in a row that always translate to the same three or four words in another language.  Almost all the neural translation engines I have seen are word-based.  There are languages that are written without spaces, like Chinese, Japanese, and Thai, but they still have individual words, and breaking text up into individual words is relatively simple.  Not so with polysynthetic languages.

I don't see any information about how Microsoft tackled the problem for Inuktitut.  I am assuming that they used a tool to break down words into morphemes.  What I would have used but didn't get funding for was the National Research Council's Uqailaut Inuktitut Morphological Analyzer, but I don't know whether Microsoft did something similar.  I am watching for any publications about it. There have been some advances lately in modeling and translating these languages, so that is not the only approach.

On the other hand, perhaps they trained a neural network to decompose words into morphemes and vice-versa without a standalone processor.  If that's the case, then the same techniques could be used for various other widespread but hard-to-translate polysynthetic languages from the Algonquian language family like Cree and Ojibwe, or Iroquoian languages like Mohawk, or Athabascan languages like Dene or Navajo, or Siouan languages like Dakotan.  It's a game changer.

Oh, and if you were curious, PointFire Translator now supports translation to and from Inuktitut on SharePoint sites; just use language code "iu".  Your browser should already support Canadian Aboriginal Syllabics.

Script Files That Download Rather Than Executing

It started with an intermittent problem from a client.  In some cases their users were being prompted to download a JavaScript file that had our product's name somewhere in the file name.

That file was a localization file that SPFx uses.  When you create an SPFx webpart or application customizer, there is a "loc" folder in which you put files that contain the character strings that vary according to language.  You can find more details about SPFx localization here. Each file in that folder will have a name corresponding to the locale, for example "en-us.js" and "fr-fr.js".  These files will then be packaged under a longer name, consisting of

(the name in the project)_en-us_(some long hex string).js
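For context, a loc file itself is just a small module that returns the localized strings.  Here is a rough sketch of what a file like "en-us.js" typically looks like; the string keys below are made up for illustration and are not from any real project.

// Rough sketch of an SPFx localization file such as loc/en-us.js.
// The keys are illustrative only.
define([], function () {
  return {
    "PropertyPaneDescription": "Configure the web part",
    "TitleFieldLabel": "Title"
  };
});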

It turns out that this packaged localization file was the one that was sometimes downloaded. Looking more deeply at why it was downloaded when other JavaScript files weren't, we found a difference in the HTTP headers that were being sent.


The major difference between the .js file that was sometimes being downloaded and the ones that weren't was that the one being downloaded had a "content-disposition" header and the others didn't.  In theory, a content-disposition of "attachment" with a file name means that the file should be downloaded, not sent to whatever is responsible for the MIME type given in the Content-Type header.  In practice, browsers know that web servers often get the HTTP headers wrong, so they look for other evidence of the intent.  Most of the time they look at the content-type and only download the file if its value is "application/octet-stream".  They apparently also look at the context in which it is being called.

In the case of the client, the download usually came when the user had just logged out and logged back in again.  Their authentication was not the usual one, so it involved some redirection to and from their on-premises identity management.  If the caching of this JavaScript file had expired but other files were in the cache, it could very well be the first file being fetched from the SharePoint Online domain, so no context was available for the browser to take into account.  In that case, it seemed, the browser respected the incorrect Content-Disposition header. Also, they were not using and did not want to use any CDN to serve these files, unlike most of our other clients with a similar architecture.  Files served by a CDN tend to have the correct HTTP headers.

That behaviour is easily reproduced by going directly to the appropriate folder of the App Catalog's ClientSideAssets library and clicking on the file name.  That file prompted the user to download it, while other .js files in the same folder did not.
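If you want to check those headers yourself without digging through the browser's developer tools, a quick sketch like the one below, run from the browser console on a page in the same tenant, prints the two headers that matter.  The asset URL here is a made-up example, not a real path.

// Check the response headers for a ClientSideAssets file (hypothetical URL).
const assetUrl =
  "https://contoso.sharepoint.com/sites/appcatalog/ClientSideAssets/" +
  "my-application-customizer_en-us_0123456789abcdef.js";

fetch(assetUrl, { method: "HEAD", credentials: "include" }).then(response => {
  console.log("content-type:", response.headers.get("content-type"));
  console.log("content-disposition:", response.headers.get("content-disposition"));
});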

I won't bore you with everything that we tried to figure this out.  At some point we were convinced that this happened with application customizers but did not happen with webparts.  As it turns out, the problem was with our naming convention: using the words "modern application customizer" in the name rather than "modern webpart" made a difference.

The difference was the name of the JavaScript file, specifically the length of the name.  The file name was already pretty long, but the difference between the words "webpart" and "application customizer" took it over some magic threshold.  Our naming convention sabotaged the attempt to reproduce the problem with a "hello, world" test.  It looks like a file with a name of 81 characters has the correct HTTP headers, but one of 87 has the wrong ones.  I'm still not sure what the exact limit is, or whether it is a limit on the file name, the path, or the full URL.  None of those lengths seem close to some magic number, but it doesn't matter, since shortening the part of the file name that we are responsible for fixes the problem.





How to make SharePoint API calls language-independent

SharePoint's API has gone through several iterations, from the 2010 API, to the 2013 API, and now the Graph API.  Throughout, it has struggled with being language-independent, particularly when dealing with lists and libraries.  Everything works fine when the site is in English and the user's language is English, but many things don't work like the examples when the API encounters a different language.  If you are writing software, or even a Power Automate flow that uses the API, you need to be conscious of the fact that the API may stop working if the user is using a different language, or the site was created in a different language.

Let's use a simple problem to illustrate.  Suppose you want to know whether a user is a site administrator.  That information is kept in a handy hidden list on the site, called (in English) the User Information List.  Let's retrieve that information using the SharePoint 2010 API, the one that starts with "_vti_bin/ListData.svc/".  The REST API call would be:

{Site URL}/_vti_bin/ListData.svc/UserInformationList(4)/IsSiteAdmin

Here, I am using a "4" to indicate a specific user's ID on this site.  This query returns a single line of xml with (in my case) the value "true".  Simple, right?

Unfortunately, that only works if the user's current UI language is English.  If it's French, that API call fails.  Instead you have to use this call 

_vti_bin/ListData.svc/ListeDInformationsUtilisateur(4)/EstLAdministrateurDuSite

 

This is because the list name and the column name used by the list web service are localized by the MUI.

In Dutch, it's 

_vti_bin/ListData.svc/LijstMetGebruikersgegevens(4)/IsBeheerderVanDeSite

 

In Hindi, 

_vti_bin/ListData.svc/उपयोगकर्ताजानकारीसूची(4)/साइटव्यवस्थापनहै

 

Don't forget to URL-encode those Hindi Devanagari characters!

_vti_bin/ListData.svc/%E0%A4%89%E0%A4%AA%E0%A4%AF%E0%A5%8B%E0%A4%97%E0%A4%95%E0%A4%B0
%E0%A5%8D%E0%A4%A4%E0%A4%BE%E0%A4%9C%E0%A4%BE%E0%A4%A8%E0%A4%95%E0%A4%BE%E0%A4%B0
%E0%A5%80%E0%A4%B8%E0%A5%82%E0%A4%9A%E0%A5%80(4)/%E0%A4%B8%E0%A4%BE%E0%A4%87%E0%A4%9F
%E0%A4%B5%E0%A5%8D%E0%A4%AF%E0%A4%B5%E0%A4%B8%E0%A5%8D%E0%A4%A5%E0%A4%BE%E0%A4%AA
%E0%A4%A8%E0%A4%B9%E0%A5%88

 (line breaks added for formatting)

All of that means that your code has to be ready with 50 different versions of that API call depending on the language of the end user.
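To make that concrete, here is a minimal sketch of the kind of per-language lookup table you would be forced to maintain with the 2010 API, using only the four languages shown above; the helper function is my own invention, not part of any SharePoint SDK.

// A sketch of the per-language table the 2010 API forces on you.
// Only the four languages from the examples above are shown; a real
// implementation would need an entry for every supported UI language.
const userInfoNames: Record<string, { list: string; column: string }> = {
  "en-US": { list: "UserInformationList", column: "IsSiteAdmin" },
  "fr-FR": { list: "ListeDInformationsUtilisateur", column: "EstLAdministrateurDuSite" },
  "nl-NL": { list: "LijstMetGebruikersgegevens", column: "IsBeheerderVanDeSite" },
  "hi-IN": { list: "उपयोगकर्ताजानकारीसूची", column: "साइटव्यवस्थापनहै" },
};

// Hypothetical helper: builds the 2010-style REST URL for a given UI culture.
function buildIsSiteAdminUrl(siteUrl: string, uiCulture: string, userId: number): string {
  const names = userInfoNames[uiCulture];
  if (!names) {
    throw new Error(`No 2010 API mapping for UI culture ${uiCulture}`);
  }
  return `${siteUrl}/_vti_bin/ListData.svc/` +
    `${encodeURIComponent(names.list)}(${userId})/${encodeURIComponent(names.column)}`;
}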

It's a little better when you are using the 2013 API.  You still have to give it the list's title, but it's the list's internal name, not the display title for the current language.  To get the list, the 2013 API call is 

/_api/web/lists/getbytitle('<list title>')

 

Better, right?  Unfortunately, the internal name of this and many other standard lists and libraries still depends on the language in which the site has been created.  When you create a new site, whether classic or modern, the site has a base language.  So if the site was created in English, you can get the information with this call 

_api/web/lists/getByTitle('User Information List')/items(4)/IsSiteAdmin

 

That call will work no matter what the user's current language is.  But getByTitle() is still not language-independent.  Using getByTitle is always a bad idea unless you know that no one is creating sites in other languages, but there aren't a lot of other options.  If your site was created in Hindi, the call has to be 

_api/web/lists/getByTitle('%E0%A4%89%E0%A4%AA%E0%A4%AF%E0%A5%8B%E0%A4%97%E0%A4%95
%E0%A4%B0%E0%A5%8D%E0%A4%A4%E0%A4%BE%20%E0%A4%9C%E0%A4%BE%E0%A4%A8%E0%A4%95%E0%A4%BE
%E0%A4%B0%E0%A5%80%20%E0%A4%B8%E0%A5%82%E0%A4%9A%E0%A5%80')/items(4)/IsSiteAdmin

(line breaks added for formatting) 

It's only a little better than the 2010 API.  The API call depends on the site, not the user, and the "IsSiteAdmin" column is not localized.  Why the internal name of a hidden list, something that only programmers will ever see, is localized is anybody's guess.  So compared to the 2010 API, you're not much better off.  You still need to have 50 different versions of the API call, even though it's based on the site language and not the user language.


How do you make the API call independent of language?  One way is to find the list using another characteristic that is not language-dependent. One such characteristic is the EntityTypeName, essentially the template for the list or library.  There will only be one User Information List per site, and its Entity Type is "UserInfo".  There could be several pages libraries, but there is typically only one; similarly for Site Assets and so forth.  So the trick is to find the list using an OData filter like

/_api/web/lists?$filter=EntityTypeName%20eq%20%27UserInfo%27

 

and use the result to retrieve the list Id, which is a GUID, and then use that GUID to retrieve the information from the list, for example 

_api/web/lists('0040809f-5e97-4920-9068-0b36e2aa4b16')/items(4)/IsSiteAdmin

 

That's two API calls rather than just one, which is unfortunate if you're trying to cut down on queries, but at least you don't have to code 50 different possibilities.
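Here is a minimal sketch of what those two calls can look like in code, assuming it runs in the browser on a page in the same site so that cookies take care of authentication; the function name is my own, not part of any SharePoint SDK.

// A minimal sketch of the two-call, language-independent approach described above.
async function isSiteAdmin(siteUrl: string, userId: number): Promise<boolean> {
  const headers = { Accept: "application/json;odata=nometadata" };

  // Call 1: find the User Information List by its language-independent EntityTypeName.
  const listResponse = await fetch(
    `${siteUrl}/_api/web/lists?$filter=EntityTypeName%20eq%20%27UserInfo%27&$select=Id`,
    { headers, credentials: "include" }
  );
  const listId: string = (await listResponse.json()).value[0].Id;

  // Call 2: use the list's GUID to read the IsSiteAdmin column for this user's item.
  const itemResponse = await fetch(
    `${siteUrl}/_api/web/lists('${listId}')/items(${userId})?$select=IsSiteAdmin`,
    { headers, credentials: "include" }
  );
  return (await itemResponse.json()).IsSiteAdmin === true;
}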

For this specific list, the User Information List, there is also a second solution.  It's based on the fact that this particular list also has a fixed URL that does not include the list name, at "_catalogs/users/detail.aspx".  We use this trick to retrieve the list by URL, using the less well-known GetList method.  On my "/sites/Hi" Hindi site, the API call could be

_api/web/GetList(@listUrl)/?@listUrl=%27%2Fsites%2FHi%2F_catalogs%2Fusers%2Fdetail.aspx%27&FilterField1=ID&FilterValue1=4

 

Neither of those tricks is completely general for all lists and libraries, but I haven't found one that applies in every case.

How about the Graph API?  The Graph API suffers from the same problems as the 2013 API, but does not support the same solutions to those problems.  So for instance after you retrieve the Site ID, you can use it to retrieve the list by name in the same way as for the 2013 API.  If the site is in English, it would be

https://graph.microsoft.com/v1.0/sites/{Site ID}/lists('User Information List')


If it’s in Hindi, you would have to use

https://graph.microsoft.com/v1.0/sites/{Site ID}/lists('उपयोगकर्ता जानकारी सूची')


or you can use

lists?$filter=displayName eq 'उपयोगकर्ता जानकारी सूची'

 

I have not yet succeeded in getting it to filter by EntityTypeName (Graph API  calls it “list/template”) or by URL, or even to find the isSiteAdmin property.  I have only succeeded in collecting an impressive set of OData filter expressions that aren't supported, so I'm afraid I don't yet have a good solution for the Graph API.  Maybe someone more clever than me will find one.

SharePoint Conferences in October - See You There!


Photo by Samuel Pereira on Unsplash


October is a busy month for SharePoint lovers. While much-awaited in-person events had to be postponed due to COVID-19, the virtual conference world is taking over, and October seems to be an especially active month.


The PointFire team is getting all geared up for the 10+ conferences we’re sponsoring, and Martin Laplante will be giving quite a few sessions you might want to attend. The good news? A lot of them are free of charge and you can get access to very interesting learning sessions.


We’ve compiled a breakdown of the conferences we’re attending as well as sessions we're giving and linked them to their respective sites so that you can easily look into the details and, if you haven’t done so already, get your tickets.



10 October: CollabDays - Lisbon

Session: Multilingual Features of SharePoint: What's New, What's Missing


17 October: M365 Saturday - Ottawa

Session: The New Multilingual Page Publishing Feature and Beyond

Schedule will be announced here


16,17 October: CollabDays - Barcelona

Session: Multilingual Features of SharePoint: What's New, What's Missing


26 October: CollabSummit Learning Day

This is an especially interesting half-day learning session. Register early, as attendance is free of charge and seats are limited.

Session: Making SharePoint Online Multilingual, With PointFire and Without

Session details here



14 November: M365 Chicago

Session: Details to be announced soon!


Other conferences we're sponsoring:


14-15 October:  European SharePoint Conference Online


24 October: CollabDays - Benelux


27 October: CollabSummit Sponsor Day


30 October:  Virtual HOU365 Friday - Houston


We'd be happy to see you there.

Feel free to reach out to our sales team and book an appointment: sales@icefire.ca.


A few new languages

Because of recent changes to the Azure Translator Text API, a few new languages in the Iranian language group are being added to PointFire Translator.


Two versions of Kurdish are being introduced: Kurdish (Central), language code "ku", also called Sorani, and Kurdish (Northern), language code "kmr", also called Kurmanji.

In addition, two languages of Afghanistan are being added: Dari (prs) and Pashto (ps).  PointFire Translator had already supported Dari and Persian, and Dari is also supported by PointFire 365 and SharePoint.  However, we had been using the same language code and translation engine for both of them, because they are so similar.  Now Dari will use "prs" and Persian will use "fa", for Farsi, another name for the same language.

SharePoint 2019 Machine Translation Service is back!

When SharePoint 2019 was first released, its Machine Translation Service did not work.  The service could be installed and was running, but most attempts to use it would result in the error message "The service application required to complete this request is unavailable. Try this operation again later. If the problem persists, contact your administrator."

The ULS log would have a message of "Unimplemented method" with a stack trace in Microsoft.Office.Web.Conversion.Framework.

Microsoft has now resolved that error!  If you install the July 2020 or later Cumulative Update for SharePoint 2019, the service works again.

The Machine Translation Service is used by two optional features of PointFire 2019: the machine translation of user interface elements and the translation of classic pages and documents in SharePoint libraries.


These functions now work as expected.  The next release of PointFire 2019 will have some performance improvements that were delayed because of the difficulty in testing improvements to a feature that very few could use.

Because of the ongoing problem, all users of PointFire 2019 have been able to get free annual licenses for PointFire Translator, which can translate documents, pages, lists, and UI elements in SharePoint 2019 with higher quality than the free Machine Translation Service (although check out this secret setting).  PointFire Translator can also translate modern pages, Excel, PowerPoint, and PDFs, which the Machine Translation Service cannot do, and it covers more languages.  This free license program will soon end. If your free PointFire Translator license is expiring, contact us about transitioning to the Machine Translation Service or renewing the license.

How Microsoft Forms Sets the Display Language for Multilingual Forms


Microsoft Forms supports multilingual forms.  How do you create these forms, how do you know what language the form will be in, and what happens when the forms are in a Microsoft Forms web part in SharePoint?

Creating a form in multiple languages is easy.  You can find detailed instructions on this page: Send a form in multiple languages.

Note that it must be a form, not a quiz.  Quizzes cannot be multilingual.  Create the form in the default language first.  Then click on the three dots at the top right corner of the screen and select Multilingual. You can change the Primary language if it got it wrong, or you can add new languages.


If users are not going to be using a SharePoint webpart to fill it out, you have to be very careful with the choice of languages, because language codes have to be an exact match.  For example, you get to choose between "français (France)" and "français (Canada)"; choose carefully, because later on you will need an exact match with the browser language.

Then for each language, hover over the language name to make the pencil and trash can appear, and click on the pencil.


That will let you edit the text in that language, including the title and the questions.


Let's try the form in different languages now.  If your browser's language preference, or one of your list of preferences, matches one of the languages of the form, then the form's user interface and content will be in that language.  However, this must be an exact match.  If the form language is "français (France)", which means language code "fr-FR", and your browser is "français (Canada)" (fr-CA) or even "français" (fr), it WILL NOT MATCH and the form will be in English.  If the form language is "Deutsch", which means language code "de", and your browser is "Deutsch (Deutschland)" (de-DE), it WILL NOT MATCH and the form will be in English, or in whatever the Primary language of the form is.

You can override the user's language preference by adding the "lang" parameter with a language code to the URL, for example adding "&lang=de" for German.  Again, an exact match is needed.  It will then ignore the user's browser configuration and show the form UI and content in German.  If you set your language preference in your Office 365 user profile, it always ignores that.

When the form is on the user's screen they can always click on the language toggle in the upper right hand corner of the form.  Selecting a language from that dropdown will override the browser language and the URL parameter and show the form UI and content in that selected language.


In SharePoint modern pages, you can add a Microsoft Forms webpart to the page.  In what language will the form display?  If your user profile does not have a language preference set, it will follow the browser settings; otherwise it will follow the user profile settings, the same thing SharePoint's own user interface does.  Good news! It does not need to match exactly: a browser setting of "fr" or even "fr-CA" will match "fr-FR" if that is the MS Forms language.  Bad news! The Forms language code must exactly match the SharePoint language code.  For instance, the language code for French in SharePoint is "fr-FR"; there is no way to match "fr-CA" in MS Forms.  Even worse news: MS Forms supports one version of German, apparently with language code "de", while SharePoint supports one version of German, with language code "de-DE".  They don't match, and there is no way to make them match.  It's a bit hit or miss: for some languages both products use the same language code, and for some they don't. For example, both support Welsh, and in both cases they happen to use the language code "cy-UK", even though the language code "cy" would have been perfectly standards compliant.

If you put "&lang=de" in the URL that is provided to the Microsoft Forms webpart, it makes no difference; the webpart ignores the parameter.

When the form is on the user's screen they can always click on the language toggle in the upper right hand corner of the webpart.  Selecting a language from that dropdown will override the browser and the profile and show the form UI and content in that language, without changing the language of the UI of the rest of the SharePoint page.

All of this would be easily fixed if MS Forms were to try a little harder to match the browser language or the SharePoint language.  The best approach would be: if the exact match doesn't work, try to match the neutral culture ancestor of the languages (in most cases a two-letter language code).  If the neutral culture of the form were matched with the neutral culture of the browser language for standalone forms, or the thread's current UI culture in the case of SharePoint webparts, then "de" would match "de-DE" and "fr-CA" would match "fr-FR".
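As a rough illustration of that fallback, here is a short sketch of the matching logic the post is asking for; it is my own illustration, not how MS Forms actually behaves today.

// Sketch of the proposed fallback: exact match first, then neutral-culture match.
// This is illustrative only; it is not current MS Forms behaviour.
function pickFormLanguage(
  formLanguages: string[],   // languages the form was authored in, e.g. ["en-US", "fr-FR", "de"]
  requested: string          // browser or SharePoint UI language, e.g. "fr-CA"
): string | undefined {
  // 1. Exact match (what MS Forms does today).
  const exact = formLanguages.find(l => l.toLowerCase() === requested.toLowerCase());
  if (exact) return exact;

  // 2. Neutral-culture match: compare just the part before the hyphen,
  //    so "fr-CA" falls back to "fr-FR" and "de-DE" falls back to "de".
  const neutral = requested.split("-")[0].toLowerCase();
  return formLanguages.find(l => l.split("-")[0].toLowerCase() === neutral);
}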

If you agree, you can vote for the User Voice idea to that effect.


I Know What I’m Doing, But…

One of my colleagues suggested that I try Collab365's MicroJobs and then write a review about it.  I subscribe to some Collab365 mailing lists; they have daily summaries of all the important news and blogs, and they organize some great conferences.  It's one of the few automated emails from outside my company that I read carefully every day, motivated by FOMO.

The newsletters and the site mention MicroJobs all the time and I keep thinking that sounds like a good idea but it’s not for me.  But my colleague insisted I should do it and Fraser Beadle of MicroJobs joined her in gently coaxing me to try it.

I’ve been working in the SharePoint and Office 365 space for over 10 years.  I run a SharePoint product company with all sorts of SharePoint experts working for us, I give talks at conferences, I write blogs, I Know What I’m Doing.  Any issue that comes up I can just search for some blogs and follow the instructions.  I don’t need hand-holding.

And yet when I tried to think of current nagging problems that might be solved in 30 minutes, I could think of several.  I couldn't decide which one to try, so I signed up for two of them, very different types of problems.  In both cases the format was a 30-minute Expert Call.  MicroJobs offers a whole range of other formats, and a lot of fixed-price deliverables, in all sorts of Microsoft areas, but the short Expert Call format was what suited me best.

Can the problems really be solved in 30 minutes?  If not, the site has a useful feature, “Custom extra” where I can ask for more time if required.  Bad news for them, good news for me, in both cases the experts solved my problems in a few minutes, walked me carefully through the solution, and then had time left over within the 30 minutes to solve a second problem.

The first problem I had was for an internal SharePoint site, used by our company's Marketing and Sales staff, with one list in particular where they wanted to change the order of the columns in the New and Edit forms, and to hide some columns.  I don’t think it’s a good use of company resources to use our developers’ time or our client support staff to do this sort of thing for our marketing and sales staff, which would mean they have less time for our customers.  Typically Sales & Marketing fend for themselves or I do it for them myself.

I had tried to do it myself.  There were complications and the usual way to do it didn’t work.  Sometimes they use Modern view and sometimes they use Classic, so that limits how it can be done and there were issues with content types and site columns.

I could have researched it myself and spent a lot of time reading blogs, found solutions for slightly different problems that may or may not apply to me, or looked up some relevant conference slides or a YouTube video.  But why do that when someone who wrote the blog or gave the conference talk or recorded the video is probably on MicroJobs and willing to do this for me?

In this case an experienced MVP was willing to do this for less than what I would pay a taxi driver for the same amount of their time.  I don’t know why they do it.  When I was a freelance consultant, I charged a lot more than that and I wouldn’t have agreed to do 30-minute assignments, not worth the overhead.  Take advantage of it before they realize they should charge more!

I headed to jobs.collab365.community and gave it a shot.

The MicroJobs site is easy to use.  I use other freelancer sites for other things, and this one is easier, and the level of experience of the freelancers on the site is noticeably high.  Log in using a Microsoft account, or LinkedIn, or Google, or Facebook, or an email.  Then search for what you’re looking for.  You’ll see offers that some of the experts have posted.  You can also post a request and have people offer to help you, or you can find the right person first and ask them whether they could do the task that you need done.

I contacted Nick through the site, which has a handy chat interface.  We discussed it a bit, then set a time, and later on we changed that time a bit.  The MicroJobs site has a built-in videoconferencing and screen sharing tool (based on Jitsi Meet it looks like) that works really well right in the browser, and lets you talk and share your screen.

I shared my screen and Nick understood right away what I wanted to do, a list of about 10 changes.  It had to be done in classic mode.  We ran into a couple of small difficulties, which reassured me that the problem was not obvious, and they were solved right away.  I’m not particularly fast in doing the "click there", "type this", and yet we were done in 15 minutes.  Pro tip: have a second problem ready.  OK, I said, if there is more time here is something else I was planning to do: add a new content type to the list, with a subset of the columns and the ability to promote a record from one content type to the other.  That turned out to be a bit more difficult.  None of the columns were site columns but this would have to be a site content type.  No problem.  Nick guided me in creating a site content type that inherits from the same parent as the other, then modifying its instance within the list.  He reassured me that we could omit a required column.  Within a few minutes we were all done and had 5 minutes left to test that it worked for new and edit forms in both classic and modern mode, and that we could promote an item to a different content type.

Verdict: on a tricky problem that I couldn't figure out, we got twice as much done as I expected, and I had to compromise on nothing.  The payment gateway released the money when he and I agreed that the work was completed.  After it was all settled, the invoice came straight from the expert, not the site.

The second problem was one that had to do directly with one of our products, so I am being less specific about the details.  For our machine translation product for modern pages, we provide instructions for calling it from a Power Automate flow so that a page can be translated as soon as it's published.  But I wasn't sure whether the instructions we planned to provide would follow best practice in terms of security and good governance.  Through the MicroJobs site I found Matthew, an expert in Power Automate, another MVP, who also knew a lot about Azure app registration, including the less common authentication method that we were using.

Chatting with Matthew prior to giving him the task, he seemed to understand quickly and he asked me whether I knew about a different feature of Power Automate, and said we should have a chat about it.  I knew a bit about that other feature and I didn’t think it applied to this. I dismissed the suggestion.  I didn’t want to change the software, I just wanted best practices for deployment.  Perhaps we could look at that other feature later if there is time.

Matthew accepted the job, set it up with Teams, and we started the call at the appointed time.  He started out saying "I just want to show you this feature, I think it will help you."  OK, I thought, let's have a look at it for a few minutes and see why he's so keen on it.  It turned out to be exactly what we needed.  We didn't have to explain away the security issues; this just solved them, and we didn't have to change a single line of code.  It was also a lot more flexible and scalable.  This was way beyond my expectations.  This is why it's useful to speak with experts: they don't just answer your question, they solve the underlying problem, combining their knowledge and experience to analyze it and think of the best way to solve it.  I would never have thought of this by just Googling answers to my how-to questions.  We started implementing his advice right away.

Just before the part where he was showing a sequence of steps with a detailed example, Matthew asked whether he could record that part of the session so he could send it to me.  Very thoughtful; I can use it for later reference if I forget some of the steps.

This MicroJob had a significant impact at very little cost.  This is definitely something I would do again.

You get to rate the freelancer.  These ratings are made public.  The freelancer also gets to rate you.  That is not public as far as I know, but might be available to other freelancers.  Apparently I’m “laid back” and “easy to work with”.  Nice to know.

But what happens if things do not go as well?  Thanks to my mistakes, I got a tour of the dispute resolution and order cancellation process.  Apparently, I ordered the same MicroJob twice.  Not only that, but I marked it as delivered twice, convinced the consultant to mark the second one as delivered (thinking that he had forgotten and it was the same job), and even paid for it twice.

I don’t think anyone had ever done all of that before, so when I contacted the freelancer to cancel it after realizing my mistake, he wasn’t certain how to fix this, and tried the dispute resolution mechanism.  

The dispute resolution mechanism is nicely thought out and goes to the dispute resolution team.  In this case, they apparently thought there was a better way to handle this.  They cancelled the dispute to reinstate the job and then the freelancer initiated a cancellation by mutual consent.  

Cancellation by mutual consent is very straightforward.

I received an email, followed the link to the site, and consented to the cancellation.  The money was immediately refunded to my credit card.  


The final verdict is that the process is well thought out, the people and service are high quality, it is a bargain, and it is essentially risk free.  Some day they will figure out that they could be charging higher rates, but for the time being take advantage of it.

Disclaimer:  I am not being paid for this review, but the people at Collab365 encouraged me to try it and to review it.  They offered to reimburse me for my first MicroJob but I declined.

Supporting another 12 Languages

In the most recent version of PointFire Translator (beta) we are introducing new or enhanced support for 12 new languages.

Of these, four are languages that are supported by SharePoint.  Irish and Kazakh are now supported for machine translation.  That means if your SharePoint site supports Irish or Kazakh, PointFire Translator can now translate its pages, documents, and lists, and PointFire 365 will automatically filter and/or redirect as appropriate.  If you want to translate the user interface, contact us; one of the steps is different for those languages than for other languages.

PointFire Translator now supports European Portuguese and Brazilian Portuguese as two separate languages.  Before this, the same translation engine was used for both, a neutral Portuguese that was actually closer to the Brazilian version.

Several new languages have been added to PointFire Translator which are not supported by SharePoint, including Māori (New Zealand), and five languages from India and Pakistan: Marathi, Gujarati, Punjabi, Malayalam and Kannada.  PointFire Translator will happily translate to or from those non-SharePoint languages, but PointFire 365 will be unable to filter by that language code.

All of those new languages have Neural Network engines behind them.  Irish, Brazilian Portuguese, Marathi, Gujarati, and Māori have a customizable engine, meaning you can re-train it with your own documents to improve the translation quality.


The other two languages, or rather one language and two scripts, are ones that PointFire Translator had supported before and which had stopped working and been discontinued.  In preparation for the Galactic Collaboration Summit, we decided to dust off our Klingon translator.  That is when we discovered that there had been an undocumented change to Microsoft's Klingon language codes.  So we are happy to announce that we have reinstated Klingon (Latin script) and Klingon (pIqaD script).  If you choose the pIqaD script, make sure that you download a font that supports it and change the font on the document.  This language only has a statistical translation engine, not a neural translation engine, so the quality is not very good.  But to paraphrase Samuel Johnson, it is like a dog's walking on his hind legs: it is not done well, but you are surprised to find it done at all.

If you're keeping count, that is 73 languages in total.