In the Model-View-ViewModel (MVVM) pattern, the view model acts as a bridge between the view and the model, encapsulating the presentation logic and state.
In Jinaga applications, you achieve this result using the Watch method. Watch creates an observer.
As the results of a specification change, the observer updates the view model.
The view model in turn notifies the view of collection and property changes, thus updating the user interface.
Create observable objects to store results that you want to display on the UI. You can use ObservableObject from the MVVM Community Toolkit.
public partial class SiteHeaderViewModel : ObservableObject
{
    private readonly Site site;

    [ObservableProperty]
    private string name = string.Empty;

    [ObservableProperty]
    private string domain = string.Empty;

    public SiteHeaderViewModel(Site site)
    {
        this.site = site;
    }
}
In your view model, create an observable collection to hold these observable objects.
public ObservableCollection<SiteHeaderViewModel> Sites { get; } = new();
Every view model should have an observer. Define a field to manage it.
private IObserver? observer;
To set up your observer, call the Watch method. Do this inside a method called Load, which your view will call when it appears. Pass in the specification, the starting point, and a lambda that will be called when new results become available.
public void Load()
{
    // Define the specification
    var sitesByUser = Given<User>.Match((user, facts) =>
        //...
    );

    Sites.Clear();
    observer = jinagaClient.Watch(sitesByUser, user, projection =>
    {
        var siteHeaderViewModel = new SiteHeaderViewModel(projection.site);
        Sites.Add(siteHeaderViewModel);
        projection.names.OnAdded(name =>
        {
            siteHeaderViewModel.Name = name;
        });
        projection.domains.OnAdded(domain =>
        {
            siteHeaderViewModel.Domain = domain;
        });
        return () =>
        {
            Sites.Remove(siteHeaderViewModel);
        };
    });
}
Within the lambda, create a new observable object for the new result and add it to the observable collection.
Then call OnAdded for every observable list in the projection. Provide a lambda that will be called when that list changes.
Return a lambda that will be called whenever a result is removed.
You can keep track of the progress that the observer is making.
The observer provides two Task properties that will resolve to tell you that it has finished. The Cached task resolves when the results are loaded from the local store. It resolves to true if the results were in cache, or false if they were not. Then the Loaded task resolves when the results are fetched from the replicator.
Here is a method that demonstrates one way to do this.
[ObservableProperty]
private bool loading = false;

[ObservableProperty]
private string error = string.Empty;

private async void Monitor(IObserver observer)
{
    try
    {
        Loading = true;
        bool wasInCache = await observer.Cached;
        if (!wasInCache)
        {
            await observer.Loaded;
        }
        Error = string.Empty;
    }
    catch (Exception ex)
    {
        Error = ex.Message;
    }
    finally
    {
        Loading = false;
    }
}
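Monitor needs to be handed the observer once it exists. One option, as a minimal sketch, is to start it at the end of Load, right after calling Watch; since Monitor is async void, this is a fire-and-forget call:

public void Load()
{
    // ... define the specification and create the observer with Watch, as shown above ...

    // Start tracking progress; Monitor updates Loading and Error as the tasks resolve.
    if (observer != null)
    {
        Monitor(observer);
    }
}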
The view might offer pull-to-refresh. When the user indicates that they want to check for new results, just call Refresh on the observer.
private async Task HandleRefresh()
{
    // Nothing to refresh until Load has created the observer.
    if (observer == null)
    {
        return;
    }

    try
    {
        Loading = true;
        await observer.Refresh();
        Error = string.Empty;
    }
    catch (Exception ex)
    {
        Error = ex.Message;
    }
    finally
    {
        Loading = false;
    }
}
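How the view invokes HandleRefresh depends on your UI framework. As one illustration, with the MVVM Community Toolkit you could mark the method with [RelayCommand], which generates a HandleRefreshCommand property that a pull-to-refresh control (for example, a .NET MAUI RefreshView with IsRefreshing bound to Loading) can bind to:

// Illustrative: [RelayCommand] generates an async HandleRefreshCommand property.
[RelayCommand]
private async Task HandleRefresh()
{
    // ... same body as shown above ...
}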
If you want the view to automatically refresh whenever new information is available, you can call Subscribe instead of Watch.
Subscribe will keep the connection to the replicator open.
Changes will be pushed in real time.
Use this with caution. Holding a connection open puts extra load on the replicator, especially if there are many clients.
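As a sketch, the call in Load would change only in the method name, assuming Subscribe accepts the same specification, starting point, and handler as Watch:

// Same arguments as Watch; results are now pushed to the observer in real time.
observer = jinagaClient.Subscribe(sitesByUser, user, projection =>
{
    var siteHeaderViewModel = new SiteHeaderViewModel(projection.site);
    Sites.Add(siteHeaderViewModel);
    // ... OnAdded handlers as before ...
    return () => Sites.Remove(siteHeaderViewModel);
});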
When the view disappears, the view model should call the Stop method on the observer. You can do this in a method called Unload, which the view will call when disappearing.
public void Unload()
{
    try
    {
        observer?.Stop();
        observer = null;
        Sites.Clear();
    }
    catch (Exception ex)
    {
        Error = ex.Message;
    }
}
Failure to stop the observer will cause a memory leak on the client.
And if the observer was created with Subscribe, it will also hold connections open longer than necessary.
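To tie it together, the view calls Load when it appears and Unload when it disappears. Here is an illustrative sketch for a .NET MAUI page; SitesPage and SitesViewModel are hypothetical names, and other UI frameworks offer equivalent lifecycle hooks:

public partial class SitesPage : ContentPage
{
    private readonly SitesViewModel viewModel;

    public SitesPage(SitesViewModel viewModel)
    {
        InitializeComponent();
        this.viewModel = viewModel;
        BindingContext = viewModel;
    }

    protected override void OnAppearing()
    {
        base.OnAppearing();
        viewModel.Load();
    }

    protected override void OnDisappearing()
    {
        base.OnDisappearing();
        viewModel.Unload();
    }
}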