SUNY IT specific information

=====Bits Project=====
[[http://lab46.corning-cc.edu/user/afassett/start|Home]] | Project Bits

===System(s)===
==Packages==
Apache2\\
MySQL Client\\
MySQL Server\\
PHP 5\\
PHP 5 - GD\\
PHP 5 - MySQL\\
phpMyAdmin\\
==Servers==
[[http://www.lullabot.com/videos/install-local-web-server-ubuntu|HowTo video: Ubuntu server]]\\
[[http://www.dokuwiki.org/install:ubuntu|DokuWiki server]]: untested / being worked on.\\
[[http://drupal.org/|Drupal]]: a content management system. [[http://www.dailymotion.com/video/x4gfqu_introduction-to-drupal_tech|HowTo video]]\\
[[http://www.linuxhomenetworking.com/wiki/index.php/Quick_HOWTO_:_Ch08_:_Configuring_the_DHCP_Server|Quick HOWTO: Configuring the DHCP Server]]\\
[[http://www.yolinux.com/TUTORIALS/DHCP-Server.html|YoLinux DHCP]]\\
https://www.dyndns.com/account/services/hosts/add.html <- for DynDNS setup\\
==Routers==
[[http://www.zebra.org/|Zebra]]: [[http://www.ibm.com/developerworks/linux/library/l-emu/|HowTo]]\\
http://www.linuxjournal.com/article/5826\\

===Terms===
[[http://en.wikipedia.org/wiki/Round-robin_DNS|Round-robin DNS]]

===Notes===
[[http://www.aeronetworks.ca/howtos/|Linux HOWTO lookup]]

=====Daily Reports=====
====Summer of 2012====
===5/14/12===
**Minutes from the meeting**\\
gw reconfig VPN\\
c107 4pat vm\\
ccc vm OpenVPN\\
hadoop -> mpi3 -> ccc\\
__Questions at the meeting__\\
1.) Does Hadoop know how to get local data? Yes.\\
2.) Can Hadoop data applications be controlled? Yes.\\
===5/15/12===
In the lab today, counting and lining up machines for the clusters.\\
Router (bits) - OptiPlex GX240\\
First cluster - 6x Dell Dimension 4600\\
Second cluster - 8x OptiPlex GX240\\
Third cluster - mixed and not ready\\
Fourth cluster - 6x Dell Dimension 4600\\
===5/16-21/12===
Installed the OS on the machines.
===5/22-24/12===
Installed OpenJDK on all the machines.
===5/28/12===
Network problem setting the machines' addresses dynamically!\\
Tried statically setting the IP to the old address (192.168.202.X/24).\\
===5/29/12===
Still having network problems...
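The static-IP fallback tried in the 5/28 entry corresponds to a stanza in /etc/network/interfaces on these Ubuntu machines. A minimal sketch only; eth0, the .10 host address, and the .1 gateway are placeholder values, not the lab's actual configuration (only the 192.168.202.X/24 range comes from the notes above):

```
# /etc/network/interfaces - one cluster PC, statically addressed
auto eth0
iface eth0 inet static
    address 192.168.202.10
    netmask 255.255.255.0
    gateway 192.168.202.1
```

After editing, the interface would be bounced with ifdown eth0 followed by ifup eth0.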
Met with Joe to solve them; Joe had met with Nick earlier that day.\\
===5/30/12===
Still having network problems. Configured the machines to point Hadoop at the Java directory (used bitsgw as a template); bitsgw has it configured as, for example, JAVA_HOME=/etc/java/jdk.
===5/31/12===
bitsgw was set up to run Hadoop on its own; now I need to configure it to work with the other machines. Looking into configuration:\\
http://hadoop.apache.org/common/docs/r1.0.3/cluster_setup.html\\
Making a cluster.
===6/1-12/12===
Network fixed statically. Worked on the Hadoop configuration for the NameNode/DataNodes and the JobTracker/TaskTrackers; still confused about how these files are created, looking for examples...\\
===6/12-17/12===
http://opennebula.org/about:about\\
Looked at the machines and ran the following command: egrep -c '(vmx|svm)' /proc/cpuinfo\\
It seems that none of the Dells in the lab will work for OpenNebula...\\
===6/18-25/12===
http://www.cyberciti.biz/faq/linux-xen-vmware-kvm-intel-vt-amd-v-support/\\
More hardware-support research for OpenNebula...\\
The next project is setting up a PXE server, so that Hadoop nodes can be created and used on the fly!\\
===6/26/12-7/1/12===
http://web.cs.sunyit.edu/~tulowij/ : together we were able to create something!
===7/2-7/6/12===
Moved the Hadoop master node from bitsgw to PC2... all other nodes moved down one!\\
===7/7-12/12===
Created user NICK on BITSGW. Documentation: https://docs.google.com/document/d/1Rhql6Xb4XFUINxOcRB8bREwS_Fom8aSVYMR13baTyEw/edit\\
Also had a meeting in C221 on Project BITS; attendees included Matt, Nick, Scott, Edward, Joe, and me.
===7/12-19/12===
Revised the BITS project documentation. Worked more on the network documentation and on reconnecting the VPN between Corning and SUNY IT. Had a meeting with Joe on Wednesday, 7/15/12. Organized the machines and separated the Hadoop machines from the bits router.
===7/20-26/12===
Looking at making an internal network for the Hadoop machines.
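The virtualization check from the 6/12-17 entry can be wrapped into a small script; a sketch, assuming a Linux host (the vmx flag means Intel VT-x, svm means AMD-V, and a count of 0 means KVM-based platforms such as OpenNebula hosts will not run on that box):

```shell
# Count logical CPUs advertising a hardware-virtualization flag.
count=$(grep -Ec 'vmx|svm' /proc/cpuinfo)
if [ "$count" -gt 0 ]; then
    echo "virtualization supported on $count logical CPU(s)"
else
    echo "no hardware virtualization support"
fi
```

This is the same test the cyberciti.biz link below describes; a zero count on the Dimension 4600s and GX240s matches the "none of the Dells work" result, since those CPUs predate VT-x/AMD-V.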
==Research==
http://www.funtoo.org/wiki/Funtoo_Linux_Networking
===7/27-7/30/12===
Setup of the machines is complete and the VPN is up. The network is set up as an intranet.
===8/1/12-8/10/12===
Started working on a custom ISO with Hadoop already installed.
==Resources==
https://help.ubuntu.com/community/LiveCDCustomization
===8/10/12-8/16/12===
Met with Joe to discuss how his script will work with my custom ISO. Joe edited the PXE server and did a refresh. Talked about generating SSH keys for the nodes in order to do passwordless login. The custom ISO is Ubuntu Server 12.04 with bin/except as needed.\\
====Fall 2012 Semester====
===9/20/12===
Researched services to accomplish our goal: PXE, netboot, and an NFS server.\\
To be clear, what we want is diskless network booting of Hadoop machines that can be created dynamically: X number of nodes with a virtual master node and a virtual router, with a VPN set up to the other routers.
==Resources==
http://nfs.sourceforge.net/nfs-howto/index.html\\
http://netboot.sourceforge.net/cgi-bin/getdoc.cgi?doc=HOWTO.html&lang=english&source=index&title=Netboot%20HOWTO\\
http://en.wikipedia.org/wiki/Network_File_System\\
http://en.wikipedia.org/wiki/NetBoot\\
http://en.wikipedia.org/wiki/Preboot_Execution_Environment\\
===10/20/12===
http://tldp.org/HOWTO/LDAP-HOWTO/whatisldap.html\\
http://www.openldap.org/\\
===11/14/12===
"Matt's email": Here's an online LDAP book I have heard good things about: http://www.zytrax.com/books/ldap/ \\
A first testing step would be to set up a local server, put a few users in it, configure a few clients to authenticate, and get that to work. When you're ready, I've always been particularly fond of these tutorials:\\
http://www.rjsystems.nl/en/2100-d6-openldap-provider.php\\
http://www.rjsystems.nl/en/2100-d6-openldap-consumer.php\\
http://www.rjsystems.nl/en/2100-d6-openldap-client.php\\
===2/12/2013===
Have been working with OpenLDAP on a local machine and have written documentation in Google Drive.
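As a first smoke test for the local OpenLDAP server described in Matt's email, a user can be added from an LDIF file. A sketch only: the dc=bits,dc=local suffix, the ou, and every attribute value here are made-up placeholders, not the directory layout actually used:

```
# test-user.ldif - hypothetical first entry for the local server
dn: uid=testuser,ou=people,dc=bits,dc=local
objectClass: inetOrgPerson
objectClass: posixAccount
cn: Test User
sn: User
uid: testuser
uidNumber: 10001
gidNumber: 10001
homeDirectory: /home/testuser
```

It would be loaded with something like ldapadd -x -D "cn=admin,dc=bits,dc=local" -W -f test-user.ldif and verified with ldapsearch -x -b "dc=bits,dc=local" uid=testuser; client authentication can then be tested as the tutorials above describe.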