Sunday, December 16, 2018

JavaScript Mutable and Immutable Objects



In JavaScript, primitives (numbers, strings, booleans, etc.) are immutable and are copied by value, while objects are assigned and passed by reference. Example as below,

//Primitives are immutable (copied by value)
var a = 10;
var b = a; // b gets its own copy of the value 10

console.log("a: "+a); // a: 10
console.log("b: "+b); // b: 10

a = 20; // reassigning a does not affect b

console.log("a: "+a); // a: 20
console.log("b: "+b); // b: 10

//Objects are mutable (assigned by reference)
var c = {name: "Zack"};

var d = c; // d points to the same object as c
console.log("c: "+c.name); // c: Zack
console.log("d: "+d.name); // d: Zack
d.name = "Arnold"; // mutating through d also changes c
console.log("c: "+c.name); // c: Arnold
console.log("d: "+d.name); // d: Arnold

Be clear about whether you are passing a value or a reference during development. This is one of the basic concepts in JavaScript.
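When you need an independent copy of an object rather than a second reference to the same one, a shallow copy breaks the link. A minimal sketch using the standard Object.assign:

```javascript
// Shallow copy: d gets its own object with the same properties as c
var c = { name: "Zack" };
var d = Object.assign({}, c);

d.name = "Arnold"; // mutating d no longer affects c

console.log("c: " + c.name); // c: Zack
console.log("d: " + d.name); // d: Arnold
```

Note that Object.assign copies only the top level; nested objects are still shared by reference.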

JavaScript Object Types


In JavaScript there are three ways you can create an object. Examples as below,

// 1. Object literal
var object1 =
{
    name: "Zack",
    age: 35
};

console.log("Object 1 Name: "+object1.name);

// 2. The Object constructor
var object2 = new Object();
object2.name = "Zack";
object2.age = 35;

console.log("Object 2 Name: "+object2.name);

// 3. Constructor function
function Object3(name, age)
{
    this.name = name;
    this.age = age;
    this.getName = function () {
        return this.name;
    };
}
var object3 = new Object3("Zack", 35);
console.log("Object 3 Name: "+object3.getName());

object1.sex = "male";
object2.sex = "male";
Object3.sex = "male"; // Set on the constructor function itself, not the instance, so object3.sex is undefined
//Object3.prototype.sex = "male"; // Setting it on the prototype would make it visible on every instance

console.log("Object 1: "+ object1.sex);
console.log("Object 2: "+ object2.sex);
console.log("Object 3: "+ object3.sex);

The third form (the constructor function) is the most used, since it is highly reusable.
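The commented-out Object3.prototype line above is worth expanding: a property assigned to the constructor's prototype is visible on every instance, while a property assigned to the constructor function itself is not. A small sketch, using a hypothetical Phone constructor:

```javascript
function Phone(name) {
  this.name = name;
}

// Set on the constructor function itself: instances do NOT see this
Phone.sex = "male";

// Set on the prototype: every instance sees this via the prototype chain
Phone.prototype.category = "mobile";

var p1 = new Phone("iphone");
var p2 = new Phone("nokia");

console.log(p1.category); // mobile
console.log(p2.category); // mobile
console.log(p1.sex);      // undefined
```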

Tuesday, October 2, 2018


When installing Atom packages, some machines will receive an error such as:

failed: unable to get local issuer certificate (UNABLE_TO_GET_ISSUER_CERT_LOCALLY)

PS C:\Code\NodeJS Exercise> apm install split-diff
Installing split-diff to C:\Users\zack.dawood\.atom\packages failedRequest for package information failed: unable to get local issuer certificate (UNABLE_TO_GET_ISSUER_CERT_LOCALLY)



In order to fix this, execute the following command:

apm config set strict-ssl false

Now install the required package; it should work:

apm install split-diff 


Sunday, February 19, 2017

VMWare Fusion Mac Power Maps Error Fix




If you are using VMware Fusion 7 to 8.5 and running Windows 10, you will see the error below when you use the Power Map / 3D Maps feature in Microsoft Excel 2013 / 2016:

---------------------------
3D Maps
---------------------------
Cannot initialize DirectX. Confirm that your hardware supports DirectX 10 or later and that you have the latest video card driver installed.
---------------------------
OK   
---------------------------




To fix the problem, shut down the Windows 10 virtual machine, click Settings, and select Display.



Note that the Accelerate 3D Graphics setting is checked.




Uncheck the Accelerate 3D Graphics setting.



Start the virtual machine, open Excel, and choose Insert > 3D Maps.


Friday, July 3, 2015

Do It Yourself ToastMasters Timing Light


Green, amber (yellow), and red signals are commonly used in Toastmasters clubs to show speakers when they have reached the minimum time, are halfway through their allotted time, and have reached the maximum time. Many clubs use a set of colored cards; some Toastmasters even carry small colored cards in their wallets for impromptu Toastmasters meetings!
But cards have to be held up in the air and are not very eye-catching. Timing lights are preferred, where actual green, amber, and red lights can be turned on to get the speaker's attention. The official Toastmasters signal light costs $110 to $185, depending on where you live.


Toastmasters Signal Light costs $110 to $185


I wanted the same thing, battery operated without external power, and at low cost. There are other do-it-yourself kits, but those need external power, so I decided to make my own. DO IT YOURSELF.


I managed to build this do-it-yourself Toastmasters timing light for less than $50, including batteries.












Parts: 

I live in Ontario, Canada. Bought it from A1 Electronics
http://www.a1parts.com/led/led.htm
http://www.a1parts.com/battery_holders/battery_holders.htm
http://www.a1parts.com/switches/index.html#ROTARY SWITCHES

1. Signal Light, 30 LEDs - $7.50 x 3 = $22.50
2. Battery Holder - $2.80
3. Rotary Switch - $4.80
4. Incandescent indicators 12VDC - $2.50 x 3 = $7.50
5. Rotary Switch Knob - $0.99
6. Dollarama Wooden Box (Transparent Lid) - $2.00

Total Parts = $40.59


1. Signal Light, 30 LEDs





Part Number         MCD
DK4555-G (Green)    18000
DK4555-Y (Yellow)   7500
DK4555-R (Red)      6500

12V  $7.50 / each

2. Battery Holder 



Part Number: 4200A
UM3 x 8, for eight "AA" cells
Size: 60mm (L) x 29mm (W) x 58mm (H)
$2.80


3. Rotary Switch


ITEM NO. DK6561 C
SECTIONS: 1
POLES: 3
POSITIONS: 4 (OFF, GREEN, YELLOW, RED)
$4.80

4. Incandescent indicators 12VDC




Incandescent indicators 12VDC

55-492(Red Lens)
55-493(Amber Lens)
55-495(Green Lens)

$2.50


5. Rotary Switch Knob





Part Number
54-374-BL
A mm -
B mm - 30
C mm - 15
Shaft mm - 6.35

Price - $0.99


6. Dollarama Wooden Box (Transparent Lid)




$2.00 Dollarama

7. Tools 


Hook-up wires, stranded ($3 if you don't have any)

Soldering iron (if you don't have one, you can get one for $5)



8 AA Batteries




Wednesday, July 16, 2014

Big Data Hadoop Hive SQL Query Hello World


Prerequisite
  • Big Data 
  • Hadoop 
  • SQL

If you are reading this blog you should know about Big Data and Hadoop.

Big Data is a technology revolution in the RDBMS world. Data in the Hadoop Distributed File System (HDFS) is stored as flat files in different formats such as CSV, tab-delimited, etc.

Also, in order to process this data directly, you need to be proficient in Java to write a MapReduce program.

To make Big Data usable for non-Java users such as data analysts, a feature to query the flat files using SQL was introduced. This is Apache Hive: https://hive.apache.org/

http://en.wikipedia.org/wiki/Apache_Hive

Hive was introduced by Facebook and is now used by companies such as Netflix. It is a powerful querying tool in the Big Data Hadoop ecosystem.

Basically, Hive is capable of transforming your SQL queries into MapReduce programs.
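As a rough illustration only (not Hive's actual generated code), a GROUP BY count compiles into a map phase that emits a (key, 1) pair per input row and a reduce phase that sums the values per key. Sketched in JavaScript over sample rows:

```javascript
// Illustrative sketch: how "SELECT phone_name, count(*) ... GROUP BY phone_name"
// roughly maps onto the map and reduce phases Hive generates.
var lines = ["1,iphone", "2,blackberry", "3,nokia", "4,sony",
             "5,samsung", "6,htc", "7,micromax"];

// Map phase: emit a (phoneName, 1) pair for every input row
var pairs = lines.map(function (line) {
  return [line.split(",")[1], 1];
});

// Shuffle + reduce phase: group pairs by key and sum the values
var counts = {};
pairs.forEach(function (pair) {
  counts[pair[0]] = (counts[pair[0]] || 0) + pair[1];
});

console.log(counts); // every phone appears once in this sample, so each count is 1
```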


The following are the steps to be done

1. Create a Hive table with the metadata information
2. Load data into the Hive table (2 types)
      a. Loading data from the local file system into Hadoop & Hive
      b. Loading data from the Hadoop file system into Hive
3. Query the table


I have test data as below (it has 2 fields: ID and PHONE NAME):

1,iphone
2,blackberry
3,nokia
4,sony
5,samsung
6,htc
7,micromax


To get started, you need Hive installed already and the Hadoop file system configured with the NameNode, JobTracker, DataNode, TaskTracker, etc.


Step 1:  Launch the Hive console from the command line / terminal



Step 2:  Create the table with the metadata information

CREATE TABLE PHONE ( ID INT, PHONE_NAME STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE;



Step 3:  Load the data from the local file system into the table

LOAD DATA LOCAL INPATH '/home/training/PHONE.txt' OVERWRITE INTO TABLE PHONE;


Step 4:  Query the table you created and loaded in Hive

select * from PHONE;



Well, you should be good with local mode.


Let's have a quick peek at the server mode (Type 2). To load data from a Hadoop file into Hive, we first need to send the file from the local file system to the Hadoop file system.


Step 1: Place the file from local file system to HDFS (Hadoop Distributed File System)

hadoop fs -put PHONE.txt



Step 2: Verify that the file has been placed in HDFS

hadoop fs -ls PHONE*



Step 3:  Create the table with the metadata information

CREATE TABLE PHONE_SERVER ( ID INT, PHONE_NAME STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE;



Step 4:  Load the data from HDFS into the Hive table

LOAD DATA INPATH '/user/training/PHONE.txt' OVERWRITE INTO TABLE PHONE_SERVER;

Step 5: Verify by performing a SQL query and checking the results

select * from PHONE_SERVER;





Tuesday, July 15, 2014

Big Data Hadoop Hive Getting Max of a Count


This example is about a query to get the max of a count.

Prerequisites
- Basic Hadoop knowledge
- Basic Hive knowledge
- Basic SQL knowledge


My input data is as below,

1,iphone,2000,abc
2,iphone,3000,abc1
3,nokia,4000,abc2
4,sony,5000,abc3
5,nokia,6000,abc4
6,iphone,7000,abc5
7,nokia,8500,abc6

Problem:

In Hive we can perform a group by as we do in ANSI SQL. Example as below,

select d.phnName,count(*) from phnDetails d group by d.phnName

The output of the above query is as below,

iphone 3
nokia 3
sony 1

You might have a scenario where you need to retrieve only the values equal to the max.

For example, if you need output as below,

iphone 3
nokia 3

Resolution:

We need to use multiple subqueries to perform this operation:

select c.phnName, c.counter 
from 
(select d.phnName as phnName, count(*) as counter from phnDetails d group by d.phnName ) c 
join 
(select max(f.counter) as countmax from
(select cnt.phnName as phnName, count(*) as counter from phnDetails cnt group by cnt.phnName ) f) g 
where c.counter = g.countmax;

Output is as below,



The queries were built up in multiple iterations, as below,

CREATE TABLE phnDetails ( id INT, phnName STRING, price INT, details STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH '/home/training/Phone/phones.txt' OVERWRITE INTO TABLE phnDetails;

select * from phnDetails;


select d.phnName, count(*) from phnDetails d group by d.phnName;


select c.phnName, c.counter from 
(select d.phnName as phnName, count(*) as counter from phnDetails d group by d.phnName ) c ;

select max(f.counter) as countmax from
(select cnt.phnName as phnName, count(*) as counter from phnDetails cnt group by cnt.phnName ) f ;


select c.phnName, c.counter 
from 
(select d.phnName as phnName, count(*) as counter from phnDetails d group by d.phnName ) c 
join 
(select max(f.counter) as countmax from
(select cnt.phnName as phnName, count(*) as counter from phnDetails cnt group by cnt.phnName ) f) g 
where c.counter = g.countmax;
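For intuition, the same two-step logic (compute group counts, then keep only the groups whose count equals the maximum) can be sketched outside Hive in plain JavaScript over the sample rows:

```javascript
var rows = [
  [1, "iphone"], [2, "iphone"], [3, "nokia"], [4, "sony"],
  [5, "nokia"], [6, "iphone"], [7, "nokia"]
];

// Step 1: group by phone name and count (the inner subquery)
var counts = {};
rows.forEach(function (row) {
  var name = row[1];
  counts[name] = (counts[name] || 0) + 1;
});

// Step 2: find the maximum count (the max() subquery)
var countmax = Math.max.apply(null, Object.keys(counts).map(function (k) {
  return counts[k];
}));

// Step 3: keep only the groups whose count equals the maximum (the join + where)
var result = Object.keys(counts).filter(function (k) {
  return counts[k] === countmax;
});

console.log(result, countmax); // [ 'iphone', 'nokia' ] 3
```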