Drupal 8 Commerce is on the Way! DrupalCon New Orleans 2016.

Many thanks to the Commerce Guys for contributing the Drupal Commerce module to the Drupal community; it took Drupal to a different level in the CMS world. Commerce 2.x, the Drupal 8 version of Drupal Commerce, is very exciting. Like any other Drupal developer or architect, I am excited about Commerce 2.x.

I was fortunate enough to attend the Commerce Guys session at DrupalCon New Orleans 2016, the very first session after the release of the 8.x-2.0-alpha4 version of Drupal Commerce. It was an amazing session that made a lot of things clearer, and many open questions were answered by the Commerce Guys themselves. Here we discuss the takeaways from that session.
 

No more Commerce Kickstart in Drupal 8. Why?

There is no Commerce Kickstart for Drupal 8 because Commerce 2.x is built as a set of loosely coupled components using several PHP 5.4+ libraries: tax (tax rates), addressing (address formats powered by Google's dataset), intl (internationalization powered by CLDR data) and zone (zone management). Using Composer we can create a new site much like Commerce Kickstart did in Drupal 7, with the following command.

$ composer create-project drupalcommerce/project-base mystore --stability dev

The above command downloads Drupal 8 and Commerce 2.x with all their dependencies into the 'mystore' folder. Before running it, please make sure Composer is installed globally on your system.

How to install Commerce 2.x on an existing site?

To install Commerce 2.x on an existing Drupal 8 site, follow these steps.

Step 1: Add the Drupal Packagist repository, which allows Composer to find Commerce and the other Drupal modules. To do so, run the following command:

$ composer config repositories.drupal composer https://packagist.drupal-composer.org

 

Step 2: The following command downloads Commerce together with the required libraries and modules (Address, Entity, State Machine, Inline Entity Form, Profile).

$ composer require "drupal/commerce 8.2.x-dev"

 

Step 3: Enable the modules commerce_product, commerce_checkout, commerce_cart and commerce_tax using Drush or Drupal Console, for example: drush en commerce_product commerce_checkout commerce_cart commerce_tax

Read more in the Commerce documentation: http://docs.drupalcommerce.org/v2/

The above steps install Commerce 2.x on an existing site. In Drupal 8, Commerce 2.x also relies on several contrib modules developed and contributed by the Commerce Guys themselves. The most notable ones are:

  1. Inline Entity Form
  2. Address
  3. Profile
  4. State Machine

What is new in Commerce 2.x?

  1. Currency Management: Integrates the Unicode CLDR project, so currencies are easy to use both inside and outside the US.
  2. Price Formatting: Price formats vary by country and by language, for example German (Austria) vs. German (Germany).
  3. Taxes: Imports tax rates automatically per country, depending on the scenario, such as B2B vs. B2C and digital vs. physical products.
  4. Stores: Multiple stores in the same Drupal Commerce instance. This covers two main use cases: a marketplace where users create their own stores with their own products (meaning multiple carts from multiple stores, from the buyer's perspective), and a company that has billing locations in multiple countries.
  5. Products: Instead of nodes there is a dedicated Product entity. On the product creation page, inline entity forms are used to create individual product variations, and those variations hold the SKUs and sets of attributes.
  6. Product Attributes: In Commerce 2.x product attributes are their own entities, so it is easy to create, edit and delete attributes. To add attributes to a product variation type, edit the respective product variation type and select the checkboxes under 'Attributes'. Form and display modes can then be used to control how the fields are shown.
  7. Order and Line Item Types: These provide separate order types for different kinds of products, such as physical and digital products, event registrations, training payments and so on, each with its own checkout flow, workflow, logic and customer experience.
  8. Order Editing: Orders are edited step by step, and the right column shows the IP address and geolocation of the order.
  9. Add to Cart Forms: Add to cart forms can include variation attribute fields as well as line item fields. We now have full control over these forms, powered by form display modes and field widget plugins.
  10. Shopping Carts: If you have multiple shopping carts, the UI lets you view and check out each of them separately.
  11. Shopping Cart block: Easily customizable shopping cart block.
  12. Checkout Flows: You can define your own custom checkout flows.
  13. Improved Checkout UX: Customers can check out as a guest or register as a new user during checkout.
  14. Payments: Under active development; Braintree and Authorize.Net are currently being integrated as reference implementations.

Conclusion

Commerce 2.x should be fully functional by early 2017, or even sooner. Commerce 2.x on Drupal 8 will make a difference in the ecommerce world.

When can we start using Drupal 8 Commerce 2.x?

See the comment below from Bojan Živanović:

Thanks, that's a great summary. We're tracking the beta1 blockers here: https://www.drupal.org/node/27..., once beta1 gets released people will be able to start building production sites on Commerce 2.x. See you in the issue queues!

Image credits: https://www.flickr.com/photos/comprock/26392816934/in/pool-drupalconneworleans2016/

Send Message to Slack from Drupal

We moved to Slack a few months back, and the one thing I love about Slack is its integration with various apps. Most integrations, such as Google Docs, Dropbox, Git and Bitbucket, work out of the box, but we wanted to integrate Drupal with it. Our need was to send a notification message to the #general channel whenever a new blog post is published.

We start off by adding a Custom Integration from the Slack App Directory.

Slack custom integration

We will be using the Incoming Webhook as we will be sending data from our Drupal site to Slack.

Select the channel where you want to post the message, or create a new one. Once the channel is selected, Slack gives you the setup instructions. The most important part is the Webhook URL, which will be used in our Drupal site. You also get some customisation options for the Slack bot, such as its name and image.

Next we come to our Drupal site. The Slack module has a Drupal 8 development release. Once the module is installed, go to the Slack configuration page at admin/config/services/slack/config and enter the Webhook URL provided by Slack when the integration was added. You can also change the username and image that will be used when sending messages to Slack.

Slack configuration


Try sending a test message from the Slack configuration UI; when everything is set up correctly you will receive a message in Slack.

Now, to send a message to Slack when a node is created or updated, add the following code:
 


// These use statements belong at the top of the .module file.
use Drupal\Core\Link;
use Drupal\Core\Url;
use Drupal\taxonomy\Entity\Term;

/**
 * Sends a message via Slack.
 *
 * @param \Drupal\node\NodeInterface $node
 *   The node that was created or updated.
 * @param string $op
 *   The operation, either 'insert' or 'update'.
 */
function send_message($node, $op) {
  global $base_url;
  $config = \Drupal::config('slack.settings');
  $channel = 'test';
  $url = Url::fromUri($base_url . '/node/' . $node->id());
  $node_title = $node->label();
  // The snippet's author is referenced through a taxonomy term field.
  $snippet_user_id = $node->get('field_user')->target_id;
  $account = Term::load($snippet_user_id)->getName();
  $username = $config->get('slack_username');
  $link = render(Link::fromTextAndUrl("here", $url)->toRenderable());
  $webhook_url = $config->get('slack_webhook_url');

  if ($op == 'insert') {
    $message = 'Snippet `' . $node_title . '` was added by *' . $account . '*. Click ' . $link . ' to view.';
  }
  else {
    $message = 'Snippet `' . $node_title . '` was updated by *' . $account . '*. Click ' . $link . ' to view.';
  }
  // This will send your message to Slack.
  \Drupal::service('slack')
    ->sendMessage($webhook_url, $message, $channel, $username);
}
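The helper above still needs to be called when nodes are saved. A minimal way to wire it up (assuming a module named mymodule and a snippet content type that carries the field_user field; in a real module you would also prefix send_message() with the module name) could look like this:

/**
 * Implements hook_node_insert().
 */
function mymodule_node_insert(\Drupal\node\NodeInterface $node) {
  // Only snippet nodes carry the field_user reference used by send_message().
  if ($node->bundle() == 'snippet') {
    send_message($node, 'insert');
  }
}

/**
 * Implements hook_node_update().
 */
function mymodule_node_update(\Drupal\node\NodeInterface $node) {
  if ($node->bundle() == 'snippet') {
    send_message($node, 'update');
  }
}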

 

Slack message


Finally we are trying to integrate Disqus comments with Slack so that when there is a new comment we get a notification about it and respond. 

There you go!
 

Get To Know About Postman Tool

Postman is a great tool for prototyping APIs, and it also has some powerful testing features. So here I share how to use Postman for API testing. I've used Postman in one of my projects as a way to interact with APIs, which is also explained here. As a tool for setting up complex HTTP requests, it is much more convenient than request specs, Cucumber, or hand-rolling them in even your favourite HTTP library.

A Little About Postman
Postman is a Google Chrome app for interacting with HTTP APIs. It presents you with a friendly GUI for constructing requests and reading responses. The people behind Postman also offer an add-on package called Jetpacks, which includes some automation tools and, most crucially, a Javascript testing library. This post will walk you through an example that uses those testing features. While they won't replace your focused unit tests, they do breathe new life into testing features from outside your applications. This makes it extremely valuable for functional testers or for developers who love to test outside-in.

HTTP VERBS generally used in POSTMAN
GET : Read a specific resource (by an identifier) or a collection of resources.
HEAD : Works the same as GET, but returns only the headers.
PUT : Update a specific resource (by an identifier) or a collection of resources. Can also be used to create a specific resource if the resource identifier is known before-hand.
DELETE : Remove/delete a specific resource by an identifier.
POST : Create a new resource. Also a catch-all verb for operations that don't fit into the other categories.

HTTP Response Status Codes
The most commonly used HTTP status codes:

200 – ok – general success
201 – created – New resource has been created
204 – no content – Success and response body empty. The resource was successfully deleted.
304 – Not Modified – The client can use cached data
400 – Bad Request – The request was invalid or cannot be served. The exact error explained in error payload.
401 – Unauthorized – The request requires user authentication
403 – Forbidden – The server understood the request, but is refusing it or the access is not allowed.
404 – Not found – There is no resource behind the URI
405 - Method not allowed
422 - Unprocessable Entity
500 – Internal Server Error

Get Postman from https://www.getpostman.com/, download the Google Chrome Postman extension, and sign up for a free account. After creating the account and logging in to the Postman tool, you need the URI you want to test, so you can verify whether it returns the correct response and status codes.

Steps for API testing using Postman.

Step 1: Open the Postman tool and log in with your credentials.
Step 2: Get the request URL (Uniform Resource Locator) of the API.
Step 3: Paste that URL into the URL field in Postman.
Step 4: Select the method you want to perform, for example GET, PUT, POST or DELETE.
(Note: the URL endpoint may differ for different methods.)

Step 5: Add the headers as key and value pairs.
Step 6: Add the body and select the raw option, in case you are using a POST or DELETE method.
Step 7: Recheck the given URL, click the Send button and observe the response in JSON format.

Let us take a scenario of creating a new profile on an XYZ website; after creating that new profile we will retrieve its data from the database using the Postman tool.
 

Prerequisites:
1. First, the user has to log out of the XYZ website, and we need a resource endpoint to which we are posting the new profile.

2. We also need the body of the profile we are posting.

3. After successfully creating the new profile, we build the request URL from the response in order to fetch that newly created profile.

Step 1: Get the request URL and paste it into the URL field provided in Postman.
ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles

Step 2: Select the POST Method.

Step 3: Add the headers and the body, selecting the raw input type for the body.
Headers:

Content-Type  : application/json

Authorization : Bearer 60557ad5f4ddd047c846928642b7aab1b94f3681

X-CSRF-Token : eMb4uODH3rNwb6r-SUt6rb9mPWtu69kdoAjcbIGgOMQ

Body Part:

{"name": "ABCD",

"account" :

{ "is_new": 1, "email": "abcd@gmail.com",

 "password": "password" }

}

 


Step 4: Click on the Send button and observe results.

Response would come in JSON Format:

Status code: 201 Created

{  "id": "438b5097-57c9-411f-9e4f-abd04192b2d1", "href": "ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1",

  "type": "profile",

  "account": {

"href":"http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/account",

 "data": { "id": "d76c0eda-2525-43d0-b9bf-69fe84860cfc",

"name": "1qw111ww2.singh",

"email": "1qw111ww2.singh@david.com",

 "created_time": "1463050948",

 "last_login_time": "1463050948",

 "status": "1",

 "email_verification_status": "0",

"phone": "",

"phone_verification_status": "0",

"href":"http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/accounts/d76c0eda-2525-43d0-b9bf-69fe84860cfc",

"type": "account",

"profile":{"href":"http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/accounts/d76c0eda-2525-43d0-b9bf-69fe84860cfc/profile",

"data": [

{

"id": "438b5097-57c9-411f-9e4f-abd04192b2d1",

"href":"http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1",

"type": "profile"

}

 ]

},

"session_id": "cXH295AMUKo5nx40aNhIwtNmo3c9x8WQrLu9uCMSBQ8",

"session_name": "SESS15dfa26f31ea72e21097a903a2b1b263",

"token": "dTbg9wLVhpqT2ShezRq7orHl5zTJxEwbPlflsxIJinY"

}}}</pre>

 

Screen shot of the Response:

Postman Screenshot

Now we have successfully created a new profile. Let us retrieve the profile data we just created from the database, through the request URL below.

Request URL:
ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1

Response: 200 OK

{
  "id": "438b5097-57c9-411f-9e4f-abd04192b2d1",
  "name": "A2?!@B=22A",
  "cover_picture": {
    "thumbnail": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/sites/default/files/styles/thumbnail/public?itok=zcW4M1JC",
    "medium": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/sites/default/files/styles/medium/public?itok=2poEpR1K",
    "large": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/sites/default/files/styles/large/public?itok=OcLV1fUn"
  },
  "about": null,
  "short_bio": null,
  "gender": "",
  "display_picture": {
    "thumbnail": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/sites/default/files/styles/thumbnail/public?itok=zcW4M1JC",
    "medium": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/sites/default/files/styles/medium/public?itok=2poEpR1K",
    "large": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/sites/default/files/styles/large/public?itok=OcLV1fUn"
  },
  "created_time": "1463050948",
  "updated_time": "1463050948",
  "status": "0",
  "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1",
  "type": "profile",
  "location": {},
  "connection_request_count": "0",
  "events": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/events",
    "data": []
  },
  "account": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/account",
    "data": [
      {
        "id": "d76c0eda-2525-43d0-b9bf-69fe84860cfc",
        "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/accounts/d76c0eda-2525-43d0-b9bf-69fe84860cfc",
        "type": "account"
      }
    ]
  },
  "works": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/works",
    "data": []
  },
  "current_campus": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/current_campus",
    "data": []
  },
  "Educations": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/educations",
    "data": []
  },
  "reading_list": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/reading_list",
    "data": []
  },
  "skills": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/skills",
    "data": []
  },
  "connections": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/connections",
    "data": []
  },
  "connections_mutual": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/connections_mutual",
    "data": []
  },
  "followers": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/followers",
    "data": []
  },
  "profiles_following": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/profiles_following",
    "data": []
  },
  "topics_subscribed": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/topics_subscribed",
    "data": [
      {
        "rel_created_time": "1463050948",
        "id": "74501461-5f98-407a-b7c0-90846058accb",
        "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/topics/74501461-5f98-407a-b7c0-90846058accb",
        "type": "topic"
      }
    ]
  },
  "stories": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/stories",
    "data": []
  },
  "calendar": {
    "href": "http://ec2-54-238-127-209.ap-northeast-1.compute.amazonaws.com/v1/profiles/438b5097-57c9-411f-9e4f-abd04192b2d1/calendar",
    "data": []
  }
}

 

Image of newly created profile retrieval.

Postman Screenshot

Your First Step to Git

Hey! So you are on this page trying to learn something about Git! Have you used a source code management system to synchronize your local code with a remote before? Have you heard that Git is the most powerful SCM? I was convinced, and yes, it is!

I actually started SCM with SVN (Apache Subversion), in fact with TortoiseSVN, a GUI tool for Windows. There are no commands to remember, so nothing to worry about: just right-click on your web root folder and choose whichever option you need! Sounds easy?

If you want to go with SVN, you can refer to this link:
http://www.tutorialspoint.com/svn/svn_basic_concepts.htm
 

However, I would suggest you try Git! Why? Well, it's new, it's challenging, and you will definitely enjoy it! You can simply follow this post to understand working with Git.

Every Git working directory is a full-fledged repository with complete history and full revision tracking capabilities, which is not dependent on network access or a central server.

Git can be really confusing at first when working decentralized. Many questions will be running in your head, like 'how do I start with it?' or 'how do I properly set up the initial repository?'. Assume you have no internet connection and you have some changes to commit: with SVN, you literally have to copy/paste them somewhere; with Git, you can still commit your changes locally using git commit and push them to master later using git push origin master. Yes, compared to SVN, Git adds complexity. You heard me right. For commands like commit vs. push or checkout vs. clone, you have to know which command works locally and which works against the server. Still, Git is the cool thing, and it's faster than SVN.


Git is distributed, SVN is not:

Git, like SVN, can have a centralized repository or server, but Git is intended to be used in a distributed mode, which means we check out the code from the central repository into our own cloned repository on our machine. Once you have made your changes and you have no internet connection, you can still commit your changes locally, or create a branch locally and commit to that branch. Once your connection is back, you can push your changes to the server.

So what is the branch system? Let's say I have a project and I want to subdivide it into 3 parts (design, development and configuration) and assign them to 3 different people (p1, p2, p3). I will create 3 new branches, one per person, besides the master branch. Let me take you through the workflow.


To create a branch:
git checkout -b branch1 master -> creates a local branch named branch1 as a child of the master branch. Similarly, create two more branches, branch2 and branch3. As p1, I have made some changes in my branch branch1.

Git repositories are usually much smaller than Subversion ones. If you want a copy of code that is already in Git, you can get the repository using git clone:
git clone <repository-url>
You can find the URL on the repository page in GitHub. Make sure you use the SSH URL instead of HTTPS: HTTPS requires basic authentication (username and password), while SSH uses your SSH key.

If your project is all new and you want to share it, you have to initialize the directory you are working in. Go to the directory and type the following:
git init
This initialises your directory as a Git repository, and you will now be able to commit local changes. To verify which files have changed, run git status.

Check the difference of what is added or deleted in each changed file.


git diff -> shows the diff of all the changed files.
git diff <filename> -> shows the diff of the specified file.

To commit:
git commit -am "commit message"
-am adds the changed (tracked) files and commits them with the message.


To Add the files:
git add .

To add a single file:
git add <filename>


To Commit the added files.
git commit -m "commit message"


You want to remove a file/folder that is already added/committed:
git rm -f <filename> (add -r for a folder)


You want to modify the committed message:
git commit --amend -m "New commit message"


This changes the message of the last commit, but only if it has not yet been pushed.

You have committed the changes locally. Now you want to sync all the branches with the same code changes, so you have to push your changes to the server. But before pushing, I want to sync my branch with master:
git pull <remotename> <branchname>


E.g. git pull origin master -> pulls the master code. Or you can fetch and then merge:
git fetch
git merge origin/master

We might sometimes get a conflict if the same file has been changed on the same lines both in master and locally. In that case you have to resolve the conflict and commit the files again (git commit -am).

Now, push the local changes:
git push origin branch1

Done!! Simple! These are the basic beginner commands you would need to push your changes.
You wanna play more with git?
Git asks for user credentials each time you push your changes or request a pull. To avoid this, as mentioned earlier, you have to add your SSH key to your Git account.

To check the git user's info.
git config -l

Here are the steps to add ssh key.

Step:1
In Terminal, Paste the text below, substituting in your GitHub email address.
ssh-keygen -t rsa -b 4096 -C "email_id@example.com"
This Creates a new ssh key, using the provided email as a label


Step:2
When you are prompted to "Enter a file in which to save the key," press Enter. This accepts the default file location(/Users/you/.ssh/id_rsa).

Step:3
Enter passphrase: [Type a passphrase or click enter if you want it as empty]
Enter same passphrase again: [Type passphrase again]
Now the key is generated.


Step:4
Ensure ssh-agent is enabled:
eval "$(ssh-agent -s)"


Step:5
Add your SSH key to the ssh-agent.
ssh-add ~/.ssh/id_rsa


Step:6

  1. Add the public key to your Git account.
  2. Open /home/you/.ssh/id_rsa.pub and copy the key to the clipboard.
  3. Log in to GitHub, click the profile photo in the top right corner, then click "Settings".
  4. In the user settings sidebar, click "SSH and GPG keys".
  5. Click "New SSH key".
  6. In the title, add a descriptive label for the new key, and paste your new key into the "Key" field.
  7. Click "Add SSH key".

    Done! You will not be nagged for credentials anymore.

Sometimes you will be asked to perform one more step, in case there is a maintainer who merges all the branches. Log in to GitHub, open the newly created branch and create a pull request so that the changes can be reviewed and merged: click on Create Pull Request.


This is the basic flow you initially need to commit your local changes to the server. A few other commands you may need:
You have modified a file and you don't need those changes anymore:
git checkout <filename> resets the file to the last committed version.
Multiple files can be separated by spaces.

To delete a branch locally:
git branch -D <branchname>


You don't need any of the changed files and want to revert to the last commit:
git reset --hard

To reset to a particular commit, use the following:
git reset --hard <commit-hash>
e.g. git reset --hard 1db6d9af7b6e3b1ebb7e9912ee514e9b50f85af1


To reset to last merge,
git reset --merge ORIG_HEAD


Let's say you have made some changes to a few committed files. If you want to save the changes temporarily for a single file, maybe you would copy/paste it somewhere, but for more than one file that gets annoying.
git stash
This reverts all the files to the last committed version while saving your changes temporarily.

You want to get back the saved changes,
git stash apply


You may want to ignore certain files while committing; instead of leaving them out of each commit manually, ignoring them from the list of changed files looks much better, and you don't have to do it every time. To set this up, navigate to the root of your Git repository, create a .gitignore file and list the files or directories to ignore.
Example: I want to ignore my settings.php file and a directory called test. For this, add entries like the ones below.


mysitefolder/sites/default/settings.php
mysitefolder/sites/default/files/test/*


You can commit this file itself if you want these changes ignored unconditionally, and to share the rules with other users when they clone the repository.


If you want to ignore a file locally only:
git update-index --assume-unchanged settings.php


To get the last commits:
git log
To get commit details of last n commits.
git log -n

Oh my God, too many! Looking kind of complex? Trust me, Git may be harder to learn, but once you do learn it, you'll find it feature-rich and functional. Git also has GUIs if you want to give them a try, such as SmartGit and SourceTree. Of course, Subversion's UI is more mature than Git's, but I find Git even more reliable. There are no limits to explore, so go around the web to find more interesting things about Git!

 

Create Apache2 Virtual Host using Shell Script

The ability to create and use tools is what makes the human race dominant in the world. Tools make our work easier and save time. The tool I am going to share is a bash shell script to create an Apache2 virtual host.

Why Virtual Host?

Using virtual hosts we can run more than one website (such as dev.drupal-cms.com, stage.drupal-cms.com and www.drupal-cms.com) on a single machine. Virtual hosts can be "IP-based" or "name-based": with IP-based hosting each website has a different IP address, while with name-based hosting multiple names run on the same IP address.

Shell script code
 

Explanation

Script expects 3 strings as input:

  1. Domain name: Name of domain you wish to give for the site. Eg: drupal-cms.local or stage.drupal-cms.com or drupal-cms
  2. Full path to webroot: Full path to site webroot. Eg: /var/www/html/drupal-cms
  3. Server admin (optional): Site server admin email id. This is optional, default value will be ‘webmaster@localhost’

Script does the following to create a virtual host for apache2:

  1. Creates the virtual host configuration file inside `/etc/apache2/sites-available/` (lines 12 to 24 of the script)
  2. Adds an IP address mapping to the '/etc/hosts' file, in the form 127.0.0.1 $name (line 26)
  3. Enables the site: a2ensite $name
  4. Reloads the Apache2 server: service apache2 reload

Usage

  1. Download or clone the script from https://github.com/manoj-apare/Virtual-Host-Script
  2. Make it executable.
  3. Run the command: sudo [path-to-script]/virtual-host-script.sh [domain-name] [full-path-to-webroot] [optional-server-admin-email-id]


Note: third argument is optional.

CLI


That's it. Now you can access the site using the newly created virtual host by clicking the link printed by the script on the CLI.

How To Create Custom SOLR Search With Autocomplete In Drupal 7

In many cases, users visiting a site already know what they are looking for, hence they  head straight to the search box. Since it is likely to be their first point of contact with the website, retailers must ensure that they get it right the first time. This is to avoid the issue of users getting frustrated by inaccurate or badly-ranked results and as a result moving on to a different site.

A good starting point is to introduce a 'suggest' (autocomplete) function, which shows a drop-down menu of search queries containing the text fragment the user has typed.

We will integrate Apache Solr with our Drupal site and build an autocomplete search.
For this, we query Apache Solr, fetch the results, and finally display them in the autocomplete. This way we can customize our autocomplete search results.

Modules Required

  • Apache solr search
  • Apache solr Autocomplete

To install and configure Apache Solr with Drupal 7, please follow this guide:

http://valuebound.com/resources/blog/installing-configuring-apache-solr-520-with-drupal-7-using-search-api-ubuntu-1404
 

Step 1:
Create a block to hold the search text field.

/**
 * Implements hook_block_info().
 */
function solr_search_block_info() {
  $blocks = array();
  $blocks['solr_search_block'] = array(
    'info' => t('Place the Solr Search Block in any region'),
  );
 
  return $blocks;
}

/**
 * Implements hook_block_view().
 */
function solr_search_block_view($delta = '') {
  $block = array();
  switch ($delta) {
    case 'solr_search_block':
      $block['subject'] = '';
      $block['content'] = drupal_get_form('solr_search_block_form');
      break;
  }
  return $block;
}

Step 2:
Create the custom form with a text field and make the text field autocomplete-enabled. Then place the following form in the block.

function solr_search_block_form($form, &$form_state) {
  $form['search'] = array(
    '#type' => 'textfield',
    '#id' => 'edit-custom-search-block-id',
    '#autocomplete_path' => 'solr-search/autocomplete',
    '#attributes' => array(
      'placeholder' => t('Search any thing'),
      'class' => array('edit-custom-search-block'),
    ),
  );

  $form['submit'] = array(
    '#type' => 'submit',
    '#value' => 'Search',
  );
  return $form;
}


/**
 * Implements hook_menu().
 */
function solr_search_menu() {
  $items = array();

  $items['solr-search/autocomplete'] = array(
    'page callback' => 'search_autocomplete',
    'access callback' => TRUE,
    'type' => MENU_CALLBACK
  );
  return $items;
}
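The form above defines a Search button, but the original post does not include a submit handler for it. If you simply want the button to hand the keyword off to the site search results page, a minimal sketch (an assumption, not part of the original code) could be:

/**
 * Submit handler for solr_search_block_form().
 *
 * Redirects the user to the core search results page for the entered keyword.
 */
function solr_search_block_form_submit($form, &$form_state) {
  $keyword = trim($form_state['values']['search']);
  if ($keyword !== '') {
    drupal_goto('search/site/' . drupal_encode_path($keyword));
  }
}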

Step 3:

Let's look at the callback function of hook_menu, which returns the autocomplete results.

function search_autocomplete($keyword = '') {
  global $base_url;

  $matches = array();
  $search_spell = _get_search_label_spellcheck($keyword);

  if (isset($search_spell) && !empty($search_spell)) {
    $search_spell_link = l(t($search_spell), $base_url . '/search/site/' . $search_spell, array('attributes' => array('class' => array('search-custom-spellcheck')), 'html' => TRUE));
  }

  // Suggest ways to complete partial words. For example, if $keyword = "learn",
  // this might return suggestions like: learn, learning, learner, learnability.
  // The suggested terms are returned in order of frequency (most frequent first).
  $suggestions = array();
  $suggestions = array_merge($suggestions, apachesolr_autocomplete_suggest_word_completion($keyword, 5));
  if (apachesolr_autocomplete_variable_get_suggest_keywords() || apachesolr_autocomplete_variable_get_suggest_spellcheck()) {
    $suggestions = array_merge($suggestions, apachesolr_autocomplete_suggest_additional_term($keyword, 5));
  }
  if ($suggestions) {
    foreach ($suggestions as $key => $suggestion) {
      $spell = substr($key, 1);
      $search = _get_search_complete_keywords($spell);
      if (!empty($search)) {
        foreach ($search->response->docs as $sugg) {
          $node_id = $sugg->entity_id;
          if (!empty($node_id)) {
            $node_detail = node_load($node_id);
            $title = $node_detail->title;
            $matches[$title] = _get_search_autocomplete_list_display($sugg, $title);
          }
        }
      }
    }
  }

  if (!empty($matches)) {
    $url = $_GET['q'];
    $url_explode = explode('/', $url);
    $url_last = end($url_explode);
    // $matches['more'] = l('VIEW ALL PRODUCTS', "$base_url/search/site/$url_last", array('attributes' => array('class' => array('search-more-autocomplete'))));
  }
  else {
    if (!empty($search_spell_link)) {
      $spellcheck_html = 'Finding for "' . $search_spell_link . '"?';
      $matches[$keyword] = $spellcheck_html;
    }
    $matches[""] = "NO RESULT FOUND";
  }
  drupal_json_output($matches);
}


function _get_search_label_spellcheck($keys) {
  if ($keys) {
    $keys = preg_replace('/[^A-Za-z\-]/', '', $keys);
    // Ask Solr to return facets from the 'spell' field to use as suggestions.
    // The number of suggestions to return is a choice; 5 is used here.
    $suggestions_to_return = 5;
    $params = apachesolr_autocomplete_basic_params($suggestions_to_return);
    if (!empty($keys)) {
      // Helper function to get suggestions from Solr.
      $result = apachesolr_autocomplete_suggest($keys, $params, $keys);
    }
    $spell = array();
    if (!empty($result) && isset($result['response']->spellcheck)) {
      foreach ($result['response']->spellcheck->suggestions as $key => $check) {
        $spell[$key] = $check->suggestion[0];
      }
    }
    // Return the last suggestion found, if any.
    return !empty($spell) ? end($spell) : NULL;
  }
}

Step 4:
Next, we need to query Apache Solr based on the text entered in the text field. This will return the results from Solr.

function _get_search_complete_keywords($keyword) {
  if (!empty($keyword)) {
    $solr = apachesolr_get_solr();
    $query = apachesolr_drupal_query("custom", array('q' => $keyword));
    $query->addParam('rows', '1000'); // How many rows of result to display default it is 10.
    $query->addParam('qf', 'label'); // Only search in title
    //The bundle which you want to search
    $query->addFilter("bundle", "article");
    $query->setSolrsort('sort_label', 'asc');
    $resp_search = $query->search();
    return $resp_search;
  }
}

Step 5:
To display the autocomplete results below the textbox, add the following code:

function _get_search_autocomplete_list_display($sugg, $title) {
  global $base_url;
  if (!empty($sugg)) {
    $nid = $sugg->entity_id;
  }
  $n_link = $base_url . '/node/' . $nid;
  $title_link = l(t($title), $n_link, array('attributes' => array('class' => array('search-title-autocomplete'), 'title' => $title)));
  // Wrap the linked title in simple markup for the autocomplete dropdown.
  $data = '<div>' . $title_link . '</div>';

  return $data;
}

There you go! With autocomplete search giving you the opportunity to tweak how content in the article content type is surfaced, more users will find the information they need on your site.

Profiling Drupal Performance with Webgrind and Xdebug

Xdebug Profiling is all about measuring the performance of PHP code.


Here we go!

Requirements:

  1. Xdebug, with profiler enabled
  2. Webgrind
  3. Xdebug Addon plugin for browser

1. Xdebug with profiler enabled

For setting up the environment, edit the php.ini file and add the following lines.

xdebug.profiler_enable_trigger = 1

If you want the cachegrind output in a preferred location, add the following:

xdebug.profiler_output_dir="/var/www/html/xdebug_profiler"

If you want a preferred output file name, add the line below:

xdebug.profiler_output_name="cachegrind.out.%u.%H_%R"

Once added, restart the web server; in my case:

service apache2 restart

Once you restart the server, visit the index page where you have phpinfo() and confirm the profiler is enabled, as shown in the image.
 

profiler-settings


2. Webgrind
To install Webgrind, download the latest version from the URL below:

https://code.google.com/archive/p/webgrind/downloads

Extract it and put it in the root folder of your web server. When you hit localhost you will see Webgrind running. In my case I already have existing cachegrind files.

webgrind


3. Xdebug Addon plugin

Next we move on to installing the Xdebug browser plugin. First download and enable the plugin.

xdebug

Once done, you will be able to see the icons shown in the below image on your browser.

xdebug2

Enable the Xdebug trace as shown in the image.



Now let's check the performance.

To do this we first write some test code in testme.php. It has 3 for loops, one of which runs far more heavily than the rest. The third one will be the slowest in our case, so this is what Webgrind should show us.

 testme0
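The test script itself appears only in the screenshot above; a minimal sketch of such a file (the loop counts here are arbitrary, chosen so the third loop clearly dominates) might look like this:

<?php
// testme.php - hypothetical profiling target with three loops of increasing cost.

function fast_loop() {
  $sum = 0;
  for ($i = 0; $i < 1000; $i++) {
    $sum += $i;
  }
  return $sum;
}

function medium_loop() {
  $sum = 0;
  for ($i = 0; $i < 100000; $i++) {
    $sum += sqrt($i);
  }
  return $sum;
}

function slow_loop() {
  // The heaviest loop; this is the one Webgrind should flag.
  $sum = 0;
  for ($i = 0; $i < 5000000; $i++) {
    $sum += sqrt($i) * log($i + 1);
  }
  return $sum;
}

fast_loop();
medium_loop();
slow_loop();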

 

Run the testme.php code by hitting its URL with the parameter XDEBUG_PROFILE=on.

testme1

Now visit Webgrind on your localhost and select the testme.php cachegrind file.
You will see output as shown in the image.

testme3


As shown in the above image, line 24 of testme.php consumes 99.84 percent of the entire testme.php execution time.


In the Drupal case, hit the page URL in the same way; below we are hitting node/4 with XDEBUG_PROFILE=on.

drupaltest1


Visit Webgrind on localhost again, where one more cachegrind file will have been created for the Drupal request.
 

drupaltest2

Just as in the testme.php case, we can see which particular lines of code are taking the most time.

 

Conclusion


In this manner we can identify the specific piece of code that takes the most time to execute and optimize that particular piece of code.

Additionally, you can see the hierarchy of all function calls and follow the same steps to check for the root function performance.

There you have it!

How to create Custom Rest Resources for POST methods in Drupal 8

One of the biggest changes in Drupal 8 is the integration of REST services in core. With Views it has become very easy to create RESTful services.

But there are situations when you need to create your own custom REST resources. Documentation is available for creating a GET request using Views or a custom module; however, there isn't much documentation on creating REST resources for POST methods.

In this article I will share how to create a REST resource for POST methods. This REST API will create an article on a Drupal 8 site from an external application.

Note: I will be using Drupal Console for generating the module and code boilerplates.

Create the module
drupal generate:module

module

 

Create the Rest Resource Plugin
drupal generate:plugin:rest:resource

rest

 

The above command creates a REST resource plugin in your module. The URL /api/custom is the path of your resource, which can be accessed like localhost:8000/api/custom?_format=json

Enable the resource from RestUI
 

rest UI


This was the easiest part.  Now we need to modify the Plugin created by Drupal Console to get the Resource working for POST methods.

In the generated plugin file, change the annotation lines

*   uri_paths = {
*     "canonical" = "/api/custom"
*   }

To

*   uri_paths = {
*     "canonical" = "/api/custom",
*     "drupal.org/link-relations/create" = "/api/custom"
*   }

Otherwise your API will expect the request to use the /entity/{entity_type} endpoint, which conflicts with the endpoint provided by core.

Next we need a serializer class for normalising the data being passed.
Add serialization_class = "Drupal\node\Entity\Node", in the Annotation for the Resource.
This will ensure that the data being passed is of entity type Node.

To call the resource, use a REST client, for example Advanced REST Client.

advanced Rest Client

Now, to create the node from the REST client, update the post() method in your resource class, which handles incoming POST requests.

All done! This is how we can create a node using a REST resource for POST methods in Drupal 8.
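The complete resource plugin file was embedded in the original post; the sketch below only illustrates the overall shape (the module name custom_rest, the plugin ID and the class name are assumptions, not the author's exact code):

<?php

namespace Drupal\custom_rest\Plugin\rest\resource;

use Drupal\node\Entity\Node;
use Drupal\rest\Plugin\ResourceBase;
use Drupal\rest\ResourceResponse;
use Symfony\Component\HttpKernel\Exception\BadRequestHttpException;

/**
 * Creates an article node from data POSTed by an external application.
 *
 * @RestResource(
 *   id = "custom_rest_resource",
 *   label = @Translation("Custom rest resource"),
 *   serialization_class = "Drupal\node\Entity\Node",
 *   uri_paths = {
 *     "canonical" = "/api/custom",
 *     "drupal.org/link-relations/create" = "/api/custom"
 *   }
 * )
 */
class CustomRestResource extends ResourceBase {

  /**
   * Responds to POST requests by saving the node built by the serializer.
   *
   * @param \Drupal\node\Entity\Node $node
   *   The unsaved node created from the request body.
   */
  public function post(Node $node = NULL) {
    if (!$node instanceof Node) {
      throw new BadRequestHttpException('No valid node data received.');
    }
    // Save the article and return it with a 201 Created status.
    $node->save();
    return new ResourceResponse($node, 201);
  }

}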

Boost your Drupal development with Docker

Vagrant is a great virtualisation tool, which I use heavily for my development purposes. But sometimes it gets a bit hectic and resource-consuming to set up a new Vagrant environment for trivial work or for testing out a module or API.

Not being a great fan of a local *AMP stack, I was looking for an alternative to Vagrant. In comes Docker, which is super fast and very easy to set up. Containers ("virtual machines") are easy to destroy and rebuild; they do not require the overhead of virtual machines but still provide a high level of isolation from the host OS.

Docker Hub has many ready-to-use Docker images for Drupal, but I preferred to create my own container that just works and runs Drupal smoothly.

  1. Create a directory anywhere in your system. I have placed it in ~/Documents/docker/drupal. ("drupal" is my Drupal root folder.)
  2. Create a file called Dockerfile and paste the below code:
  3. What does it do? It installs all the necessary packages required for running Drupal. Note: there is no MySQL package installed.
  4. Once the Dockerfile is in place, create the Docker image from it.
  5. Run docker build -t drupal . This will take a few minutes and will create an image named "drupal".
  6. Now we need a container for MySQL. The reason for having a separate container for MySQL is to keep the Drupal container fast.
  7. Run docker run -p 3308:3306 --name drupal-mysql -e MYSQL_ROOT_PASSWORD=root -e MYSQL_DATABASE=drupal -d mysql:5.5 This will take a few minutes to download the MySQL image. Once it is downloaded, subsequent runs of the command execute in milliseconds. NOTE: You need to give the MySQL container IP as the host and 3308 as the MySQL port.
  8. Run the Drupal container from the image created in the build step above.
  9. docker run --name drupal8 -p 8080:80 -p 8028:22 --link drupal-mysql:mysql -v ~/docker/drupal8/drupal:/var/www/html -d drupal This maps port 8080 on the host system to Apache's port 80 inside the container. Also notice we are linking our MySQL container into the Drupal container.


That’s it. Now if you do a docker ps you will see the running docker containers.

Docker ps

Now, go to http://localhost:8080 and you will see the Drupal installation screen; from there it is a plain Drupal installation.

Brownie Points:

  • When your work is done: docker stop drupal-mysql && docker stop drupal8
  • Start working again: docker start drupal-mysql && docker start drupal8
  • Destroy the environment and rebuild: 
    • docker rm drupal-mysql, then `docker run -p 3308:3306 --name drupal-mysql -e MYSQL_ROOT_PASSWORD=root -e MYSQL_DATABASE=drupal -d mysql:5.5`
    • docker rm drupal8, then  `docker run --name drupal8 -p 8080:80 -p 8028:22 -p 3038:3036 -v ~/docker/drupal8/drupal:/var/www/html -d drupal8`

How to define your own Services in Drupal 8

A service is a PHP class that provides a single piece of functionality throughout the application, so you can easily access each service and use its functionality wherever you need it. That also makes it easy to test and configure in your application. This is called service-oriented architecture, which is not unique to Symfony or even PHP.

The Services and Dependency Injection Container concepts have been adopted by Drupal from the Symfony framework. Accessing the database, sending email, or translating the user interface are examples of services in Drupal 8.

Let's look at how to define your own service in Drupal 8 custom module development.

 

Step 1:
Create the .info.yml file [custom_services_example.info.yml]

Step 2:
Create the ‘mymodulename.services.yml’ file [custom_services_example.services.yml]

Here the file name is 'custom_services_example.services.yml', where 'custom_services_example' is our module name.

'custom_services_example.say_hello' is the service name we define; it follows the pattern of the module name concatenated with a unique name.

We also declare the class for the service, 'class: Drupal\custom_services_example\HelloServices', which will be kept under the 'src' folder.

Dependencies can be added in the following way:
arguments: ['@modulename.services1', '@modulename.services4', '@modulename.services7']
In this case there are no dependencies.

For a detailed explanation of the structure of the .services.yml file, please visit https://www.drupal.org/node/2194463

Step 3:
Create the class 'HelloServices.php' under the 'src' folder.

This is a simple class that provides the service.
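The class itself was embedded in the original post; based on the output shown further below (the protected say_something property and the sayHello() call), a minimal sketch looks like this:

<?php

namespace Drupal\custom_services_example;

/**
 * A simple example service.
 */
class HelloServices {

  /**
   * A default message held by the service.
   *
   * @var string
   */
  protected $say_something = 'Hello World!';

  /**
   * Returns a greeting for the given name.
   */
  public function sayHello($name) {
    return 'Hello ' . $name . '!';
  }

}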

How do we access our own defined service?

Accessing the service globally. 

$service = \Drupal::service('custom_services_example.say_hello');

If you want to test this, you can enable the Devel module, go to the path 'devel/php', and run the following code:

 

$service = \Drupal::service('custom_services_example.say_hello');
dsm($service);
dsm($service->sayHello('rakesh'));

So you will get the following output.

 

  • Drupal\custom_services_example\HelloServices Object
    (   
          [say_something:protected] => Hello World!   
          [_serviceId] => custom_services_example.say_hello
    )
     
  • Hello rakesh!
     


https://github.com/rakeshjames/custom_services_example

For  more details please visit  the following links

  1. https://www.drupal.org/node/2133171
  2. http://symfony.com/doc/current/book/service_container.html
     