mySphere Posts


At a high level, a hybrid cloud is a mixture of private and public cloud environments that work in tandem to run your workloads and apps. That definition sounds straightforward, but when you dive in and start to work, you can find hybrid cloud architectures difficult to manage without the right tools.

You can watch a video of a real-world example to clearly understand what works well in a public cloud environment and what is best kept in a private one.




I found it today while googling for a solution for my Mac, and it works!

The web is moving to HTTPS, preventing network attackers from observing or injecting page contents. But HTTPS needs TLS certificates, and while deployment is increasingly a solved problem thanks to the ACME protocol and Let’s Encrypt, development still mostly ends up happening over HTTP because no one can get a universally valid certificate for localhost.

This is a problem because more and more browser features are being made available only to secure origins, and testing with HTTP hides any mixed content issues that can break a production HTTPS website. Developing with HTTPS should be as easy as deploying with HTTPS.
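Until you adopt a proper local CA tool, one stopgap is a self-signed certificate for localhost generated with openssl (the browser will still warn until you explicitly trust the certificate, and the `-addext` flag is a sketch that assumes OpenSSL 1.1.1 or newer):

```shell
# Generate a self-signed certificate + key for https://localhost, valid 1 year.
# -nodes skips key encryption so a dev server can read it without a passphrase.
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -keyout localhost-key.pem -out localhost-cert.pem \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1"
```

Point your dev server at the two .pem files, and add the certificate to your OS trust store to silence the browser warning.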

Link to GitHub



IBM Verse is a powerful cloud-based email experience. It’s optimized for web browsers and mobile devices. It helps you focus on your top priorities and take control of action items. This one-hour course will help you get started.



With the launch of Domino 10, knowledge of JavaScript and how to work with JSON objects will be required for the new development model.

As I am working on an IoT project, I need to learn JSONata to manipulate lots of JSON objects.

JSONata is underpinned by the semantics of the location path inspired by XPath, the ability to format the output into any JSON structure inspired by XQuery, and the functional programming model inspired by Scheme.

The semantics of the location path inspired by XPath

  • Simple and intuitive syntax for selecting values within a structure.
  • Powerful predicate mechanism for complex queries and data joins.
  • Built-in functions and operators for manipulating and aggregating data.

The ability to format the output into any JSON structure inspired by XQuery

  • Standard JSON syntax used for constructing new objects and arrays.
  • Any valid JSON data is also a valid JSONata expression.
  • Allows template style queries to be constructed.

The functional programming model inspired by Scheme

  • Implements the ‘metacircular evaluator’ described in SICP.
  • Supports user-defined functions (closures).
  • Turing complete.
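To make those ideas concrete, here is a small illustrative query (my own example, not taken from the JSONata docs): an XPath-style path navigates into the document, a block expression transforms each matched item, and a built-in function aggregates the result.

```
/* Input document */
{ "orders": [ { "price": 10, "qty": 2 }, { "price": 5, "qty": 4 } ] }

/* JSONata expression: path navigation plus a built-in aggregate */
$sum(orders.(price * qty))    /* evaluates to 40 */
```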

The lightweight footprint of the runtime processor enables it to run in the web browser or as a Node.js module with no package dependencies.

Take a look at



Today I gave a presentation about machine learning and IoT focusing on industrial maintenance. The event was the third industrial maintenance workshop in Belo Horizonte. An opportunity to talk about IBM Watson and IIoT to the industry in my region.


Data Science Machine Learning Uncategorized


Domino 10 beta 2 started last week. I downloaded the files and wanted to run it on Docker.

I followed Tim Clark’s blog series about Domino 9.0.1 on Docker.

I changed a few lines in the Dockerfile (just the installation file name).

Another difference was the command to start the container: I added port 8585 to do a remote setup.

docker run -it -p 1352:1352 -p 8888:80 -p 8443:443 -p 8585:8585 --name Domino10Beta -v domino_data:/local/notesdata kenio:Domino10Beta2




I ordered two servers from SoftLayer to host some WordPress sites.

I tried to update the servers using yum update, but I got the error below:

Loading mirror speeds from cached hostfile [Errno 12] Timeout on (28, 'Connection timed out after 30001 milliseconds')
Trying other mirror. [Errno 14] curl#56 - "Callback aborted"
Trying other mirror. [Errno 14] curl#56 - "Callback aborted"
Trying other mirror. [Errno 14] curl#56 - "Callback aborted"

The first problem: the DNS server in resolv.conf was not correct.

The second problem: the yum repos were changed by SoftLayer, and you need to revert the CentOS-Base.repo file to the original file.
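A sketch of the fix, assuming a CentOS server (the 8.8.8.8 resolver is just an example; use your provider’s DNS servers):

```shell
# 1. Point resolv.conf at a DNS server that actually answers
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# 2. Back up the SoftLayer repo file, then restore the stock CentOS-Base.repo
sudo cp /etc/yum.repos.d/CentOS-Base.repo /etc/yum.repos.d/CentOS-Base.repo.softlayer
# (put the original CentOS-Base.repo from a fresh CentOS install in its place)

# 3. Clear yum's metadata cache and retry the update
sudo yum clean all
sudo yum update
```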




Many people want to start studying neural networks and machine learning as a whole, so I decided to share the guide I’m using to study these two technologies.

First, you have to choose a language. I chose Python.
Python can be downloaded through the Anaconda distribution, which, besides bundling the most common packages, includes a control panel for installing other libraries:

Now you’re going to need an IDE. I currently use Visual Studio Code.

Don’t know Python?

Follow the Basic Tutorial category; it’s a start. There are many good courses on Coursera, Udemy, and edX.

After doing the tutorial, some libraries are a “must have” for machine learning:

- NumPy, a library for arrays and mathematical functions:

- For plotting graphs and visualizing data:

- OpenCV, for viewing and editing images from Python:

For those who want to work with classic machine learning, we have:

- Scikit-learn, a Python library with all sorts of algorithms:

- Weka, an application with a graphical interface for reading data, preprocessing, and machine learning algorithms:

For those who plan to work with neural networks / deep learning, it’s a different track.

There are four major frameworks: TensorFlow, Keras, PyTorch, and Theano. I use TensorFlow.
Do you want to understand how these networks work, with a visual explanation?

TensorFlow has a playground for you:

After reading this material, it’s time to install TensorFlow. Its installation is a little finicky, so READ the instructions carefully; it will prevent headaches later. In summary:

- Install the CUDA Toolkit and check that the system variables are correct (really check; there is a chance it was not installed correctly).

- Install the CUDA Toolkit drivers.

- Install cuDNN.

- Install TensorFlow, CPU or GPU version (preferably have only one of them installed).

When you install, follow TensorFlow’s own step-by-step guide.

Installed? Tested? Now you don’t know where to start?

TensorFlow itself has good tutorials to get going. I recommend two:

- An MNIST-based tutorial, using the classic dataset of handwritten digits:

- A CIFAR-10 tutorial, using the classic dataset of 60,000 images in 10 classes:

Do you want more tutorials? There are more: learn how to use TensorBoard, TensorFlow’s manager and visualizer for neural networks.

You can even save the current state of the network and reload it later: …/summaries_and_tensorboard

Do you want a site with historical and classic datasets?

Access the UCI Machine Learning Repository:

Do you want a site with current and complex datasets?

Create an account on Kaggle:

Do you want a list of datasets with the current state of the art and other applications for these datasets (including MNIST and CIFAR-10)?

You have it here:

Did you find a dataset you want to work on? Do you want to know how people are solving a particular problem? Then get ready to read papers, get ready to read LOTS OF PAPERS, and they’ll probably be posted here:

Do you have questions about how some network works?

It’s probable that Siraj Raval has already explained it:

If you want more reliable sites with explanations of any network / architecture / problem solution, I recommend O’Reilly Media and Medium.

There is also the CS229 course, a milestone in the area, but quite extensive.

There are several courses on Udacity, beyond the nanodegrees, some paid, others free; it’s even hard to choose which one to study. If you have any other questions about what or how to search, remember: Google is your friend.

Thanks to Ayrton Denner for this guide.

AI Deep Learning Machine Learning


IBM published a TechNote listing links and PDF files about GDPR for ICS products.

See the TechNote here

Thanks to Robert Ingran for sharing this.



Command-line tools are always welcome. Today I needed to set up my Mac to work with IBM Cloudant.

Install curl on your Mac if you haven’t done it yet.

1 – You need to set up the Cloudant DB using your IBM Cloud account.

2 – Go to service credentials and get your username, password, and hostname:

"username": "xxxx-xxx-xxx-bluemix",
"password": "12345678901234567890",
"host": ""

3 – Test the connection

curl -v -u xxxx-xxx-xxx-bluemix ''

The command will prompt for the password: 12345678901234567890

You will see a lot of lines, but pay attention to the line Authorization: Basic. This will be used to set up acurl (authorized curl) to avoid typing the password every time you use curl.

* Using HTTP2, server supports multi-use
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
* Server auth using Basic with user ‘xxxx-xxx-xxx-bluemix’
* Using Stream ID: 1 (easy handle 0x7fabd0805800)
> GET / HTTP/2
> Host:
> Authorization: Basic asddadasdDASDdaDadaDSERQERQRQERQEDSGSFDGSFDG==
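The Basic token is nothing magical: it is just the base64 encoding of username:password, so you can also compute it locally instead of fishing it out of curl’s verbose output (the credentials below are the placeholders from step 2):

```shell
# Encode "username:password" as the Basic auth token.
# printf (not echo) avoids a trailing newline sneaking into the encoding.
printf '%s' 'xxxx-xxx-xxx-bluemix:12345678901234567890' | base64
```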

4 – Edit the .bash_profile or .bashrc file in any text editor. For example, from the command line, issue the following command:
open -e .bash_profile

5 – Add the following line to the file, then save the file.
alias acurl="curl -s --proto '=https' -g -H 'Authorization: Basic asddadasdDASDdaDadaDSERQERQRQERQEDSGSFDGSFDG=='"

6 – On the terminal run : source .bash_profile

7 – Issue a command using the new alias. For example, issue a command to view your account information.
acurl -X GET ''
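With the alias in place, ordinary CouchDB-style calls work against Cloudant (the hostname below is a placeholder; _all_dbs, PUT-to-create-a-database, and POST-a-document are standard CouchDB API endpoints):

```shell
# List all databases in the account
acurl -X GET 'https://<hostname>/_all_dbs'

# Create a new database called "mydb"
acurl -X PUT 'https://<hostname>/mydb'

# Insert a JSON document into it
acurl -X POST 'https://<hostname>/mydb' \
      -H 'Content-Type: application/json' \
      -d '{"name": "test document"}'
```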

TIP: The hostname is not friendly; you can edit your hosts file and create an alias.

On the Mac terminal, type: sudo vi /etc/hosts

Ping your hostname to get the IP address.

Add a line like the one below:

<ip address>  xxxx-xxx-xxx

Save the hosts file.

Test the connection:

acurl -X GET ''


Linux MAC


I met several people when I was in Las Vegas. No one knew about my city until I talked about the 2014 Soccer World Cup and the Germany 7 x 1 Brazil game.

The video below shows another face of the city.

Uncategorized web


For professionals who bill on an hourly basis—lawyers, accountants, and so on—time tracking is a critical part of the billing process.

A time-tracking tool that allows them to easily enter work hours and obtain summary reports at the end of the week or month is of critical importance.

In this tutorial, you will see the process of building a simple time-tracking tool and deploying it on IBM Cloud.