Monday, September 4, 2017

Setting up MySQL for remote access

This setup of MySQL lets you PuTTY into the Linux box where MySQL is installed and run SQL queries remotely. My development rig sits behind my firewall, so both machines connect to the same wifi/network.

Let's get started. First, you need to install OpenSSH on your Ubuntu Linux box:

    sudo apt-get install openssh-server

This installs the OpenSSH server and starts the ssh service.

Now, from your remote PC/laptop, test whether you can PuTTY into your Linux box:

    putty <ip> and log in with root/password

If SSH login is denied, you need to set the root password and enable root login in the sshd_config file:

    1. Test if you can log in via ssh with root credentials on the Linux box itself:

        ssh -v localhost (you should get a permission-denied error, because root login is disabled by default)

    2. Edit the sshd_config file:

        nano /etc/ssh/sshd_config

    3. Navigate to Authentication section, and edit PermitRootLogin

        PermitRootLogin yes

    4. Restart the ssh service

        sudo service ssh restart

    5. Test that a new session is created:

        ssh -v localhost (this should now succeed without errors and a session is created)

    6. Try to remote in to the Linux box again, this time via PuTTY, and you should be able to log in via SSH.

Next you need to configure the MySQL config file with the IP address of your Linux box:

    nano /etc/mysql/mysql.conf.d/mysqld.cnf

Edit the bind-address line to point at your box's IP:

    bind-address  =  192.xx.xx.xx

Now you need to restart your mysql service:

    sudo service mysql restart

The next step is to configure root with both localhost and remote access to MySQL. Log in to mysql again as root:

    mysql -u root -p

Next, we need to create a root user for each of the following hosts:

    - localhost
    - %
    - 192.xx.xx.xx

Create root user for the hosts listed above:

    create user 'root'@'localhost' identified by 'pword';
    create user 'root'@'192.xx.xx.xx' identified by 'pword';
    create user 'root'@'%' identified by 'pword';

Grant all privileges on all databases to the hosts above:

    grant all privileges on *.* to 'root'@'localhost';
    grant all privileges on *.* to 'root'@'192.xx.xx.xx';
    grant all privileges on *.* to 'root'@'%';

Now, flush privileges so the grants you just created for root take effect:

    flush privileges;

To test remote access, PuTTY into your Linux box from a remote PC. Once logged in, connect to mysql via:

    mysql -u root -p

This should bring up a screen similar to below:



Note, though, that both the Ubuntu Linux box where your MySQL instance is installed and your PC/laptop should be connected to the same wifi network.
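
Before digging into credentials or grants when a remote connection fails, it can help to confirm that the MySQL port is reachable at all. Here is a minimal sketch in Python (the helper name is mine; 3306 is MySQL's default port):

```python
import socket

def can_reach(host, port=3306, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False for your box's IP, recheck the bind-address setting and any firewall between the two machines before troubleshooting MySQL users.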

In the next article, we will connect IntelliJ to the MySQL installation from this post.


Installing and Reinstalling MySQL on Ubuntu Linux

Recently I started to set up a development environment at home with the technologies for my next project: Spring with Maven and JPA, running on JBoss, connected to MySQL, and using the IntelliJ IDE for development.

During this setup I had to install and reinstall MySQL, with a lot of online googling to find answers on how. Unfortunately, not many of the guides on the internet helped me in a comprehensive, complete way (although the instructions from DigitalOcean were pretty good).

A lot of the install/reinstall steps were scattered over several articles and Stack Exchange, Stack Overflow, and Ask Ubuntu threads. So I started to document the process myself, and this is what I came up with.

To start, I am just setting up MySQL for the root user, which is fine since this development rig is only for my home setup and is not meant for any production environment. Root's elevated access also makes it easier to set up the other stuff I need for my project.

The first step is to sudo as root, together with the root user's environment:

      sudo -i

Once logged in as root, you need to install mysql:

      sudo apt-get install mysql-server mysql-client mysql-common dbconfig-mysql

If you have a previous mysql installation, you need to remove all instances of it first:

      1. Back up any my.cnf files you have, copying them to your personal folder:

             cp /etc/mysql/my.cnf /home/users/someacct/Documents/my.cnf.bak

      2. Remove mysql folder:

             rm -rf /etc/mysql

      3. Remove the mysql packages:

             sudo apt-get remove --purge mysql-common mysql-client mysql-server dbconfig-mysql

      4. Remove leftover packages and dependencies of mysql:

             sudo apt-get autoremove
             sudo apt-get autoclean

After installing mysql, install mysql-workbench (mysql GUI tool):

      sudo apt-get install mysql-workbench

Next, run the mysql_secure_installation script to address security concerns in a default mysql installation:

     sudo mysql_secure_installation

The script takes you through a wizard covering root-related permissions, letting you change the root password and choose whether mysql can be accessed from outside localhost.

Now, test your mysql installation with root:

    mysql -u root -p

This takes you to the mysql prompt, where you can begin to use mysql. You can test it by running the command:

    show databases;

This should show all the default databases (including mysql) that come with your installation.

In my next post, I will go through the steps I took to set up mysql for remote access.


Sunday, September 14, 2014

Complete Custom Test Automation Framework - Using C#, MS EXCEL and Telerik Testing Framework - Part 1

In this series of posts, I will share with you a complete custom test automation framework that I built using Excel as a test driver, coded in C# on top of the Telerik Testing Framework, that you can use for your own test automation. You are free to extend this solution, subject to the GNU General Public License.

Okay, let's start. First off, when designing a test automation framework, you need to understand the AUT (application under test) that you need to automate. Based on my personal experience (yours may vary depending on what your application is, etc.), I have found that unit or acceptance tests are best when you have more access to your application's code; they are a lot more stable and easier to maintain.

I am a huge fan of this diagram, and it's a great starting point when discussing the options available to you when you need to start writing your own automation framework (source: http://www.velocitypartners.net/blog/2014/01/28/agile-testing-the-agile-test-automation-pyramid/):



Fig 1- Agile Test Automation Pyramid

However, there are also plenty of times when you may need to write UI automation tests (integration tests), and which of the available tools suits your needs depends on several factors.

I've compiled a table below of some of my personal experiences with these tools:

Coded UI

  Pros:
  - Tightly coupled with Visual Studio.
  - Great for recognizing UI objects, especially WPF controls and MS-native controls.
  - Can be integrated with other MS testing tools: Load Test, Unit, and Web Performance Tests.
  - You can code in either C# or Visual Basic.

  Cons:
  - Tightly coupled with Visual Studio; you can't use it outside of Visual Studio.
  - Needs some work in terms of recording and playback, especially when debugging code and managing multiple UI maps.

Test Complete

  Pros:
  - Superior object recognition capability.
  - You can code using a variety of scripting options: C#Script, JScript, VBScript, etc.
  - Can be integrated with other SmartBear tools like AQTime (used for profiling and detecting memory leaks).
  - Has a great UI for recording/playback and debugging code.

  Cons:
  - Not open source; no free option.

Telerik Test Studio

  Pros:
  - Superior object recognition capability.
  - You can write full-blown C# or Visual Basic code, which is awesome if you already code in either language.
  - Has a great UI for recording/playback and debugging code.
  - Works great when recognizing complex Telerik UI objects.
  - Has an open source testing framework, the Telerik Testing Framework, that can be used outside of the Visual Studio integration.

  Cons:
  - Has some minor nuances that you need to be aware of.
  - A bit on the heavy side; comes with a lot of software/tools that you may not need or use.


I did not include Selenium in this table. Although it is great, and would be my first go-to tool for automating a web-based app, it is not a full-fledged tool (it lacks strong UI support for debugging your code), and if you are just learning automation it can be difficult at first because it forces you to code right away. QTP may be the granddaddy of all UI testing tools out there, but I find a lot of issues with QTP when recognizing new UI controls. Plus, it is extremely expensive compared to these other tools. So, unless you have very deep pockets to invest in your test automation right out of the gate, it may not be the right tool for you.

After looking at the pros and cons of using these tools, I've decided to pick and use Telerik as my solution because of the following:

1. Tight integration with Visual Studio. And since the web-app that I was tasked to automate was built in .NET, this was a natural choice for me.

2. You have the C# language behind you. You're not limited to just a scripting language; the full stack of C# .NET technology is at your disposal. For example, you can use .NET reflection with relative ease.

3. Has an open source option in the form of the Telerik Testing Framework. You are free to create and distribute your code (of course subject to some restrictions in the EULA) as you need.

In my old job, my primary responsibility as Senior Test Automation Engineer was to evaluate tools and recommend the best technology stack for our needs (and, obviously, the most cost-effective at the same time). That is why we had a lot of unit tests, SDETs coding integration/acceptance tests using SpecFlow and MSTest, and of course Telerik Test Studio as our UI testing solution.

In this post, we will start looking at a similar custom test automation framework that I built earlier. (The one we were actually using back then obviously cannot be made public here, and it uses a more advanced hybrid solution.)

Let's get started...

First, you need Visual Studio 2012 or better installed on your machine. Start Visual Studio and create 2 projects. The first is a class library project; name it CustomTestController. This project will contain all the code for running your test cases; we call this your Test Controller.

Next, create a second Visual Studio project, this time a console application project. Name this project TestRunner; it will basically be your test runner, responsible for instantiating your Test Controller and managing your environment.

To understand the design of this custom test automation framework further, I created a diagram below showing the different components working in this solution:


Fig. 2 - Custom Test Automation Framework Design


The other parts of the solution (Test Driver, Test Cases, Test Config, and Object Repository) are XML-based files that are built and managed using Excel. In the next post, we will delve more into the structure of these files as well as how to manage them.

By the third post, I will publish the full source code for this solution on GitHub, which you can fork and use as you need. Thanks, and see you next time.


Monday, September 1, 2014

Pivotal Tracker API v5 - Create your own Visual Studio Plugin

If you're into Agile software development, then most likely you or your company is using a tool to manage your user stories. There is a lot of Agile project management software out there, and I've had great experiences with 2 of the most popular: JIRA and Pivotal Tracker.

In this blog post, I will share with you a sample project I created that lets you hook up to your project in Pivotal Tracker. I will go through the basics of connecting to the Pivotal Tracker API v5, getting your token and credentials, and walking through the sample code, which grabs all the user stories in my project and displays them in Visual Studio.
Let's get started...
As I mentioned, Pivotal Tracker is a great agile project management software. A quick wiki on Pivotal Tracker:
"Pivotal Tracker is Pivotal Labs' software as a service product for agile project management and collaboration. In July 2011, Pivotal Tracker had over 250,000 registered users.
The tool includes file sharing and task management, velocity tracking and iteration planning; release markers; and progress charts. There is an API for extensions and third party tools."

What I like most about Pivotal is the fact that you can extend its functionality and plug in directly using your development tool. Although I am showing you how to use Visual Studio to plug in to Pivotal, you can use whatever your favorite language is.

To start, you need to have access to Pivotal Tracker to be able to go through this sample project. For this blog, I just created a sample project with a 60-day trial version. This project may not work after 60 days because the trial may have expired, but with a few minor tweaks you can use the code to access your own project.

Below is a sample Pivotal project, with a few user stories:



Our Visual Studio Plugin will grab all these user stories in Pivotal and show them in a Textbox. 

Get your user credentials...

First thing you need to do is get your credentials, specifically your API token. To do that you need to either use curl via the command below:
$ curl -X GET --user username:password "https://www.pivotaltracker.com/services/v5/me"
This displays a JSON result with a bunch of information. What you will be interested in is the entry for "api_token" and the "id" under "projects".
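
If you want to pull those two values out of the curl output programmatically, here is a minimal sketch in Python (the helper name is mine, and the payload is abbreviated to just the fields mentioned above; the real /me response contains much more):

```python
import json

def extract_credentials(me_json):
    """Return (api_token, [project ids]) from a /services/v5/me response."""
    me = json.loads(me_json)
    token = me["api_token"]
    project_ids = [p["id"] for p in me.get("projects", [])]
    return token, project_ids

# Abbreviated sample payload containing only the fields we care about.
sample = '{"api_token": "99cbf9c263dcb63e5505260822d647bd", "projects": [{"id": 1157418}]}'
```

Run the curl command, feed its output in as me_json, and you get the token and project ids in one go.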

Another way to get this is by going through "Profile" under your login name on the upper right hand side of your screen, similar to below:


Once you have your token and project id, you can plug them into your code.

Creating your Visual Studio Plugin...

The first thing you need to do is go through the API reference from Pivotal Tracker. It provides a ton of information to get you started, as well as for enhancing this solution to satisfy your team's/company's needs.

You can access the v5 API from this location - https://www.pivotaltracker.com/help/api

The next thing you need to understand is that the Pivotal Tracker API v5 returns a JSON result. For example, to grab all the user stories from your project, you can use the curl command below:

$ export TOKEN=99cbf9c263dcb63e5505260822d647bd
$ export PROJECT_ID=1157418
$ curl -X GET -H "X-TrackerToken: $TOKEN" "https://www.pivotaltracker.com/services/v5/projects/$PROJECT_ID/stories"

This will display all the user stories in your project in JSON format. Notice that the JSON result follows a common structure, displaying such elements as id, story_type, name, description, etc.
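
As a quick, language-neutral illustration of that structure, the id/name extraction that the C# code later in this post performs can be sketched in Python (the helper and sample data are mine; the sample array is trimmed down to just id and name):

```python
import json

def story_list(stories_json):
    """Build 'id,name' lines from a /stories JSON array."""
    stories = json.loads(stories_json)
    return "\n".join("{},{}".format(s["id"], s["name"]) for s in stories)

# Abbreviated sample of the array the stories endpoint returns.
sample = '[{"id": 1, "name": "First story"}, {"id": 2, "name": "Second story"}]'
```

Each element of the JSON array becomes one comma-separated line, which is exactly what the plugin later shows in its textbox.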

In your Visual Studio project, you need to mimic this structure so you can bring over the data from Pivotal back to your C# classes.

Next, create your Visual Studio project as a Windows Forms project and add the elements shown in the screenshot below:

The form only has a single element: a textbox with vertical and horizontal scrollbars enabled. Notice also that the project has C# classes for Label, Owner, and Story. These mirror the structure of the JSON result returned when you send an API call.

Other classes may be necessary depending on what the JSON result comes back with from your request.

After creating your project, you need a method that sends the request to Pivotal via an API call. This is the main logic that takes care of getting the data, and it's shown below:

 public string GetStories(string project)
 {
     // Build the stories endpoint URL for the given project id.
     string url = "https://www.pivotaltracker.com/services/v5/projects/" + project + "/stories";

     // Set up the request: JSON content, GET, and the Pivotal token header.
     var httpWebRequest = (HttpWebRequest)WebRequest.Create(url);
     httpWebRequest.ContentType = "application/json";
     httpWebRequest.Accept = "*/*";
     httpWebRequest.Method = "GET";
     httpWebRequest.Headers.Add("X-TrackerToken", Token);

     var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();

     // Read the JSON response body and deserialize it into Story objects.
     var responseBody = new StreamReader(httpResponse.GetResponseStream()).ReadToEnd();
     List<Story> stories = JsonConvert.DeserializeObject<List<Story>>(responseBody);

     string storylist = null;
     foreach (Story story in stories)
     {
         storylist += story.id + "," + story.name + Environment.NewLine;
     }
     return storylist;
 }

Explaining the code ....

In the method above, the first few lines assemble the HttpWebRequest object with the URL we need to grab our stories.

The next few lines are important, they are responsible for the content, the header, and the type of request that would be sent over to the API:

            var httpWebRequest = (HttpWebRequest)WebRequest.Create(url);
            httpWebRequest.ContentType = "application/json";
            httpWebRequest.Accept = "*/*";
            httpWebRequest.Method = "GET";
            httpWebRequest.Headers.Add("X-TrackerToken", Token);

Notice also that you are passing a Token variable to the Web Request header information, this corresponds to the token you got from the first curl command you executed or from the Token on your Profile inside Pivotal Tracker.

Next, you need to issue the command to send the request over, and store the result to a variable:
var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();

If you recall, when you created the project you had separate C# classes to handle the result of the API call; however, the data being returned is JSON. How do you convert the data from JSON to proper C# classes? I used Newtonsoft's open source JSON framework (Json.NET) as part of this solution.
Once that's plugged in as a reference in your Visual Studio project, I used it to deserialize the result as below:

            var responseBody = new StreamReader(httpResponse.GetResponseStream()).ReadToEnd();
            List<Story> stories = JsonConvert.DeserializeObject<List<Story>>(responseBody);

Notice that the result from the stream (in JSON format) is read into a string and then deserialized into the matching C# class you created:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace ManagePivotalProject
{
    public class Story
    {
        public string kind { get; set; }
        public string id {get;set;}
        public string created_at {get;set;}
        public string updated_at {get;set;}
        public string accepted_at {get;set;}
        public int estimate {get;set;}
        public string story_type {get;set;}
        public string name {get;set;}
        public string description { get; set; }
        public string current_state { get; set; }
        public long requested_by_id { get; set; }
        public long project_id { get; set; }
        public string url { get; set; }
        public List<string> owner_ids { get; set; }
        public List<Label> labels { get; set; }
        public string owned_by_id { get; set; }
    }
}
Once you have this information, you can iterate through the list and build the string that is displayed in the Textbox as a result:

            string storylist=null;
            foreach (Story story in stories)
            {
                storylist += story.id + "," + story.name+Environment.NewLine;
            }

You should see something similar to screenshot below:



There you go... you have now created a Visual Studio plugin that grabs all the user stories from your Pivotal Tracker project! You can do many things with this solution. You can extend it further to update your stories from within Visual Studio back to Pivotal, which might be useful if you have SDETs or devs that need to update user stories and send the updates back to your Product Owner. Another extension might be to automatically grab these user stories, definitions, and acceptance criteria and create SpecFlow feature files, so your SDETs and devs don't need to write the tests themselves in Visual Studio. There are a lot of possibilities with this solution.

Hopefully you enjoyed this post and if you need the source code for this project, just put in a comment here and I will email it to you. Have fun!

Saturday, September 14, 2013

Using Specflow and Visual Studio to generate test summary report

Okay, so you've been using SpecFlow for your acceptance tests, and you have started to build up test suites for different components of your system. Now, let's say you want to consolidate your results into a report to show your boss or manager, or just to show off to the guy sitting next to you...

You could run your tests individually and see the results in Visual Studio Test Explorer, or maybe use what Marcus did in his blog. Or you can create a Windows Forms application that customizes and simplifies the process for you: make a few inputs and you're good... you have a pretty HTML report that you can show your co-workers.

SpecFlow reporting, anyone?


I was in the same predicament when our company started embracing Agile as part of our Agile transformation process. I recommended a headless testing framework that did not rely on mouse clicks and screen element validation for UI testing, which can sometimes be unreliable and a bit slow. As we started to accumulate tests and feature files, I started looking at ways to report these tests in a format that can be read and easily understood by anybody in the organization.

Marcus had a great article that I came across, and I started to play around with it. Although his approach was to create an external tool accessed via Visual Studio, I wanted to extend this to other folks, like SDETs in our organization, and also add a few options so you can parameterize the inputs that go into your report. That's how I came up with the solution/project that I'm sharing with you. If you just want the solution and to dig right into the code, send me an email and I'll forward the complete solution plus the test project.

Building the Solution


Okay, let's get this started. First off, you need a test project with your SpecFlow tests wired up. For practice purposes, I just created a simple test project with 4 scenarios wired up in SpecFlow, as below:

Fig 1
As you can see from the screenshot above, this example is pretty straightforward. I defined 4 scenarios: the first 2 add two numbers, the 3rd subtracts, and the 4th multiplies 2 numbers. Notice that I also put in some custom tags (positivetest, negativetest, notimplementedtest); I'll get back to those in a sec. Even more straightforward is the step definition file for the test scenarios above (Fig 2).

Fig 2


Also from the screenshot above, you can see that when I run these tests through Visual Studio Test Explorer (left-hand side of the screenshot), I see results for these tests: which passed (green), which failed, if any (red), and which are not implemented (yellow). Pretty nice, but as you might have guessed, this list is going to get really long as soon as you start piling up tests.

However, as you can see from the screenshots below, when you run the solution you just need to provide a few inputs (Fig 3), or leave them blank to default to all, and you get a report similar to what is shown in Fig 4.

Fig 3


Fig 4
There are a few components to this - the Windows Form solution in Fig 3, the batch file that accepts inputs and of course MSTEST and SPECFLOW.EXE files. 

Configuring the solution


Once you load the solution, you need to update the code behind the form; you can do this by right-clicking the form once it loads in VS, or by pressing F7. Scroll down to the Load method in your form class and set up the paths to your test project, the bin directory (or directories) where your DLLs are, and where you want to store the TestResults.html file.


        private void SpecFlowReportGeneratorTool_Load(object sender, EventArgs e)
        {
            btnViewReport.Enabled = false;
            textSpecFlowProject.Text = @"C:\BlogPosts\SampleSpecflowProject\SampleSpecflowProject\SampleSpecflowProject.csproj";
            textDLLFile.Text = @"C:\BlogPosts\SampleSpecflowProject\SampleSpecflowProject\bin\Debug\SampleSpecflowProject.dll";
            textResultsFile.Text = @"C:\BlogPosts\SpecflowReportGenerator\SpecFlowReportGeneratorTool\bin\Debug\TestResults.html";
            initialResultFilename = textResultsFile.Text;
       
        }


The .csproj test file itself is just used to pick up the test project name, so I included a few lines in the solution that you can use to configure this with your company name and override the test project naming:

        private void btnViewReport_Click(object sender, EventArgs e)
        {
            // Both the single- and multi-project paths show the same report,
            // so the MultiProject flag does not change anything here.
            ReplaceTextInFile(textResultsFile.Text, "YourCompany Tests Execution Report", "SampleSpecflowProject Test Execution Report");
            Process.Start(textResultsFile.Text);
        }


The actual replacing is done through the code below:

       public void ReplaceTextInFile(string filePath, string replaceText, string findText)
        {
            try
            {
                // Read the whole file into memory.
                string content;
                using (var objReader = new System.IO.StreamReader(filePath))
                {
                    content = objReader.ReadToEnd();
                }

                // Swap the banner text.
                content = Regex.Replace(content, findText, replaceText);

                // Overwrite the file with the updated content.
                using (var writer = new StreamWriter(filePath))
                {
                    writer.Write(content);
                }
            }
            catch
            {
                // Ignore I/O errors; the report simply keeps its default banner.
            }
        }
The code above is again pretty straightforward: you open the file, store it in the content variable, and do a Regex.Replace() to search for specific text, in this case replacing "SampleSpecflowProject Tests Execution Report" with "YourCompany Tests Execution Report". After the update, the same file is overwritten, so when you open the HTML file the banner reads with your company name instead of the project name.
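
For illustration, the same open / replace / overwrite sequence can be sketched in a few lines of Python (hypothetical helper, same parameter order as the C# method):

```python
import re

def replace_text_in_file(file_path, replace_text, find_text):
    """Read the file, regex-replace find_text with replace_text, write it back."""
    with open(file_path) as reader:
        content = reader.read()

    content = re.sub(find_text, replace_text, content)

    with open(file_path, "w") as writer:
        writer.write(content)
```

Note that the search string is treated as a regular expression, just like in the C# version, so plain banner text works as-is.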

Once the program has all the parameters set up, it calls the batch file through the code below, with the parameters assembled from the input textboxes.

        void worker_RunWorker(object sender, DoWorkEventArgs e)
        {
            start = new ProcessStartInfo();
            start.FileName = @"C:\BlogPosts\SpecflowReportGenerator\SpecFlowReportGeneratorTool\bin\Debug\SpecflowReportGenerator.bat";

            start.Arguments = ProcArguments;
            start.UseShellExecute = false;
            start.RedirectStandardOutput = true;
            start.CreateNoWindow = true;
            
            using (Process process = Process.Start(start))
            {
                using (StreamReader reader = process.StandardOutput)
                {
                    string result = reader.ReadToEnd();
                    Result = result;
                }
            }
        }

Putting it all together

Once you have the solution properly configured, the last step is configuring the batch file, which is a cinch. The details are below:

@echo off
if Exist TestResult.trx del TestResult.trx 
if Exist %3 del %3

IF %2==-notags (
@echo on
"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\mstest.exe" /testcontainer:"%4" /resultsfile:TestResult.trx 
goto specflow
)

IF %2==-withtags (
@echo on
"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\mstest.exe" /testcontainer:%5 /resultsfile:TestResult.trx /category:%4
goto specflow
)

IF %2==-withmultipleproj (
@echo on
"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\mstest.exe" %5 %6 %7 %8 /resultsfile:TestResult.trx /category:%4
"C:\SpecFlow\SpecFlow_v1.9.0_bin\tools\specflow.exe" mstestexecutionreport %1 /testResult:TestResult.trx /out:%3
echo Created results file - FixedResult.html
goto end
)

:specflow
"C:\SpecFlow\SpecFlow_v1.9.0_bin\tools\specflow.exe" mstestexecutionreport "%1" /testResult:TestResult.trx /out:"%3"
echo Created results file - FixedResult.html

:end

You need to set the path to where your Visual Studio is installed, specifically where MSTEST.EXE lives. You also need to put in the correct path to where you copied/extracted/installed Specflow.exe. After that, you're set.

The batch file does the heavy lifting of creating the TestResult.trx file that is normally generated when you run a test. It then jumps to the appropriate option (-notags, -withtags, -withmultipleproj) so it can call MSTEST and SPECFLOW with the parameters you pass from your solution. Lastly, it generates a test results file (FixedResult.html) that shows the results of your test execution, similar to what's shown in Fig 4.

Conclusion 

Remember those tags I mentioned earlier? Those same tags can be used as inclusion parameters when you run your tests. Say, for example, you want certain tests to be part of a smoke test: just tag your SpecFlow scenario with SmokeTest (or any name that describes what the group is for) and enter that name in the Inclusion Tags textbox when you run the solution, and voila! It will show only the tests that fit that description, as below:

Fig 5
And there you have it: a nice-looking reporting tool using MSTEST/SPECFLOW, with a few enhancements through our solution.

If you like this article and want to get your hands on the full source code plus the sample test project, just email me and I'll send it to you.

Happy Coding!