
Moderator: Nicolas Barbé

The goal: what are the limitations of the frontend compared to what we can do in the backend?



Common examples:

  • Social integration
  • Twitter wall
  • Google Maps

To avoid using an IFRAME, we make HTTP requests directly from the client side.

Problems with requests from the client

  • Authentication - you should not send authentication data (tokens or user info) from the client, for security reasons.
  • Often it is completely impossible because the provider of the service denies it via CORS.

Would be nice if Magnolia made that easy on the server.

Problems with 3rd-party service requests (from the client or the server):

  • You are dependent on the quality of service of the external service, i.e. what if the service is down? How does your website look then?
  • What if the 3rd-party service has bad QoS? An idea could be a generic cache/queue that buffers the last responses (either via an app or some other way).

Another approach (Infusionsoft): we cache the Infusionsoft info in our JCR so that it is always available. The same goes for posting to a service - we can cache the request if the remote service is not available.
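The "cache the last good response" idea from the discussion could be sketched like this. This is a minimal illustration, not an existing Magnolia API: the class name, the `Supplier`-based fetch hook, and the in-memory map (standing in for JCR storage) are all assumptions.

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Hypothetical sketch: wrap calls to a remote service so that the last
// successful response is remembered and served when the service is down.
// A real implementation would persist to JCR instead of a map.
public class LastGoodResponseCache {
    private final Map<String, String> lastGood = new ConcurrentHashMap<>();

    // 'fetch' stands in for the real HTTP call (e.g. to Infusionsoft).
    public Optional<String> get(String key, Supplier<String> fetch) {
        try {
            String fresh = fetch.get();   // remote call succeeded
            lastGood.put(key, fresh);     // remember the last good response
            return Optional.of(fresh);
        } catch (RuntimeException serviceDown) {
            // Remote unavailable: fall back to the cached copy, if any.
            return Optional.ofNullable(lastGood.get(key));
        }
    }
}
```

With this in place, a template can keep rendering the last known data even while the remote service is unreachable; whether that staleness is acceptable depends on how important the service is.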

Depends on how important the service is.

A benefit of client-side requests: less load on your server.

What are the important protocols:


XMPP (chat)


What about UGC interactions that take place on the public instance? How can this information be persisted back to the author instance?

Missing Processed Resources - without them, to do anything dynamic you need a page or a component that an author has to create, which is fragile: it could be moved. Processed Resources would let you - the developer - just include the file where you want it (in webresources) and rely on it to perform server actions for you.


Generic proxy for service requests

What if Magnolia provided a generic proxy for service requests, so that a web developer could make service requests without any Java coding?

The results could then be surfaced in the page either via standard templating (synchronously) or via AJAX, to better handle error situations.
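The generic proxy idea could look roughly like the sketch below. Everything here is hypothetical: the class, the registration API, and the `BiFunction` standing in for the actual HTTP call are assumptions, not Magnolia code. The point is that the page only ever calls a local URL like `/proxy/twitter/...`, while the secret token stays on the server (answering the authentication and CORS problems above).

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiFunction;

// Hypothetical sketch of a generic server-side proxy: the page requests
// /proxy/{serviceId}/{path} and the server forwards it to the configured
// endpoint, adding credentials that never reach the browser.
public class GenericServiceProxy {
    static class ServiceConfig {
        final String baseUrl;
        final String secretToken; // stays on the server, never sent to the client
        ServiceConfig(String baseUrl, String secretToken) {
            this.baseUrl = baseUrl;
            this.secretToken = secretToken;
        }
    }

    private final Map<String, ServiceConfig> services = new HashMap<>();
    private final BiFunction<String, Map<String, String>, String> httpGet;

    // 'httpGet' abstracts the real outgoing HTTP call: (url, headers) -> body.
    public GenericServiceProxy(BiFunction<String, Map<String, String>, String> httpGet) {
        this.httpGet = httpGet;
    }

    // In a real setup this registration could come from light-module config.
    public void register(String id, String baseUrl, String secretToken) {
        services.put(id, new ServiceConfig(baseUrl, secretToken));
    }

    // Handles e.g. a page's request for /proxy/twitter/search?q=magnolia.
    public String handle(String serviceId, String path) {
        ServiceConfig cfg = services.get(serviceId);
        if (cfg == null) {
            throw new IllegalArgumentException("Unknown service: " + serviceId);
        }
        Map<String, String> headers = new HashMap<>();
        headers.put("Authorization", "Bearer " + cfg.secretToken);
        return httpGet.apply(cfg.baseUrl + path, headers);
    }
}
```

A web developer would then only configure the endpoint and token; no Java coding per integration.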

Generic queue for service requests

What if Magnolia provided a generic queue for service requests to cover the QoS problems?


  • A project's code makes the request to the Magnolia queue instead of directly to the service.
  • If Magnolia makes a GET request and the server is down, the proxy keeps trying until it finally gets through.
  • If Magnolia makes a POST request and the server is down, the same applies.

How hard would it be to build a generic queue? It could also be a buffer or a proxy.

Karel: for this you could use any caching system. Such as? The nginx cache.

Could this be a connected service? In your light module you would configure a caching system for a specific endpoint.


What about progressive web apps? They often use a service worker to make the actual request (of course this is still just in the client).

AdminCentral / Backend

How to connect to an external system? What features do we need in light development?

Use case: deploy new functionality without restarting AdminCentral, because authors are working.

What are the use cases for working with data? Bringing in data from an external source and enriching it.

Services has some solutions for this:

  • Define an action with JavaScript.
  • A JS API as a proxy to the Vaadin communication layer: send events, respond to events, open apps.
  • Simpler REST endpoints.