Who restarted my server / troubleshoot unplanned events

Although “auto update” on Windows Server 2012 was configured only to check for and download updates, there was an unplanned server reboot.

Another issue arises when multiple administrators manage the same servers and end up blaming each other over who restarted or shut down the box or applied updates.

Well, from the event logs it is easy to identify who rebooted the server, and you can also add a custom view to Event Viewer to filter for unplanned events.

To identify who restarted the server:

Go to Event Viewer > Windows Logs > System, choose “Filter Current Log…” from the Actions pane, and filter for event ID 1074.
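If you prefer the command line, the same filter can be run with wevtutil (a quick equivalent of the GUI steps above; /rd:true returns the newest events first):

wevtutil qe System /q:"*[System[(EventID=1074)]]" /f:text /rd:true /c:10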

Create a custom view to troubleshoot unplanned server reboot events:

Follow this link (Create a Custom View in Event Viewer to show Reboot Events).
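If the GUI custom view is not handy, a rough command-line equivalent is to query the usual shutdown-related event IDs (41 unexpected reboot, 1074 requested shutdown/restart, 6008 unexpected shutdown; the exact ID list here is my assumption of what such a view would filter on):

wevtutil qe System /q:"*[System[(EventID=41 or EventID=1074 or EventID=6008)]]" /f:text /rd:true /c:20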


Cannot log on to Windows XP without activation beyond 30 days

Well, guys, this is one of the most common issues faced by system administrators who forget to activate XP within 30 days of installation.

In my case, the issue arose from a P2V of a Windows XP physical machine: due to the drastic change in hardware architecture, Windows XP lost its activation and required re-activation.
See my previous blog post.

With these instructions I was able to first configure the network settings and then activate the OS.
Here are the brief step-by-step instructions:

  • Turn on your computer and log in to your account. You get a message telling you that you need to activate your product. Click Yes, and the Activate Windows screen will load.
  • Now press the Windows key + U and Microsoft Narrator should pop up. Click OK on the little disclaimer Narrator gives you, and behind it will be the Narrator options screen. In the top left of the Narrator options screen you’ll see a little computer icon; click it and a drop-down menu will appear.
  • Click About Narrator. On the next pop-up, click the Microsoft Web site link. You now have access to Internet Explorer! Whether you have an internet connection at the time or not doesn’t matter; we aren’t actually trying to get online.

Now we are going to access your desktop and Start bar. In the address bar, type C:\WINDOWS\explorer.exe and press Enter.

VOILA! You have access to your entire computer just like you normally would! Don’t close Narrator, though, because Windows tends to realize what you’re doing and reset back to the Activation screen. It’s not bullet-proof; sometimes my computer will reset back to the activation screen, but you can just do the same thing again as many times as you want!
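For completeness: once explorer.exe is up, you can open Network Connections to configure the NIC and then relaunch the activation wizard from Start > Run. The command below is the stock XP way to bring the wizard back up (the article itself doesn’t spell this step out, so treat it as my addition):

%systemroot%\system32\oobe\msoobe.exe /a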

Thanks to this internet article

XVA won’t start after successful import

I converted a physical machine (Windows XP) successfully & imported to my XenServer with XenConvert. The import appears to be successful. However, when I start the machine, all I get is a black console screen with the following info:

Citrix XenServer
http://www.citrix.com
cirrus-compatible VGA is detected

Processor 1: Xen(R) Virtual CPU
XS Virtual IDE Controller Hard Drive (73GB)
XS Virtual ATAPI-4 CD-Rom/DVD-Rom

Boot device: Hard Drive – success.

A licensed legacy application was running on this machine, and we needed to eliminate every physical PC in the datacenter; otherwise there would have been no need to import XP at all, I would simply have built a new VM.
After repeated reboots with no success, I searched the forums and someone suggested fixing the boot sector with a Windows XP repair disc.
The ISO of the existing XP SP3 didn’t offer the repair screen, since the original install didn’t match the ISO file.

The SystemRescueCd-x86-2.5.1 ISO came in handy: I booted from this Gentoo-based Linux distro and fixed the boot sector with “ntfsfix”.
I then disconnected the ISO and, voila, the VM booted successfully.
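For reference, the fix from the SystemRescueCd shell was roughly the following (the device name /dev/sda1 is an assumption; check the fdisk output for the actual XP system partition):

fdisk -l              # identify the NTFS partition holding the XP install
ntfsfix /dev/sda1     # fix common NTFS problems and schedule a consistency check for the next Windows boot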

So far only the VM boots successfully, but any geek would guess that, due to the HAL changes between the physical and virtual hardware, XP ends up in the re-activation process. Without networking configured, the VM cannot connect to the internet… so how will I fix this?
Look out for the next blog post… 🙂

10 things to look for in a data center

Everyone’s going to the cloud. The cloud’s all the rage. Almost no IT discussion is complete without mentioning “the cloud.” But when it comes down to it, the cloud is nothing more than systems hosting information in a data center somewhere “out there.”

Organizations have discovered the benefits of offloading infrastructure development, automatic failover engineering, and multiple coordinated power feeds, not to mention backups, OS maintenance, and physical security, to third-party data centers. That’s why “going to the cloud” ultimately makes sense.

Unfortunately, not every data center is ready for prime time. Some have sprung up as part of a cloud-based land grab. Review these 10 factors to ensure that your organization’s data center is up to the task.

1: Data capacity

2: Redundant power

3: Backup Internet

4: Automatic hardware failover

5: Access control

6: 24×7×365 support

7: Independent power

8: In-house break/fix service

9: Written SLAs


10: Financial stability

more here


Create Large VHDs in seconds!

simul-post

The VHD Tool at MSDN by Chris Eck can create large VHDs via the command line in seconds. Wow! This is better than chocolate ice cream and just short of Spumoni. (And when the fat guy rates something in the range of ice cream, you know it’s a good tool.)

Here is what he says about it…

Latest News

The final release of v2 is now available.
I’ve added a “repair” function which is designed to undo an expand operation on a base VHD when differencing VHDs are present. This is useful in cases where an admin accidentally expands a base VHD when Hyper-V snapshots are present.
To ensure data integrity in the case of an error, please make a backup copy of your VHDs before altering them with this tool.

Introduction
VHD tool is an unmanaged code command-line tool which provides useful VHD manipulation functions including instant creation of large fixed-size VHDs. The source code is included.

Requirements
A computer running one of the following Windows operating systems:
Server: Windows Server 2003 or above
Client: Windows XP or above
NTFS file system

Usage
VhdTool.exe /create <FileName> <Size> [/quiet]
VhdTool.exe /convert <FileName> [/quiet]
VhdTool.exe /extend <FileName> <NewSize> [/quiet]
VhdTool.exe /repair <BaseVhdFileName> <FirstSnapshotAVhdFileName> [/quiet]

Create: Creates a new fixed format VHD of size <Size>.
WARNING – this function is admin only and bypasses
file system security. The resulting VHD file will
contain data which currently exists on the physical disk.

Convert: Converts an existing RAW disk image file to a fixed-format VHD.
The existing file length, rounded up, will contain block data.
A VHD footer is appended to the current end of file.

Extend: Extends an existing fixed format VHD to a larger size <NewSize>.
WARNING – this function is admin only and bypasses
file system security. The resulting VHD file will
contain data which currently exists on the physical disk.

Repair: Repairs a broken Hyper-V snapshot chain where an administrator
has expanded the size of the root VHD. The base VHD will be
returned to its original size. THIS MAY CAUSE DATA LOSS if the
contents of the base VHD were changed after expansion.

Known Issues
There are currently no known issues.

Frequently Asked Questions
How do I file a bug?
Click on the “Issue Tracker” tab and choose to “Create New Item”.

Examples
Create a new 10 GB fixed VHD in the current directory.
VhdTool.exe /create “c:\Program Files\MyApp\foo.vhd” 10737418240

Convert an existing flat image file into a VHD & do not output status to the command line.
VhdTool.exe /convert bar.img /quiet

Extend an existing fixed format VHD to a larger size.
VhdTool.exe /extend foo.vhd 21474836480

Repair a Hyper-V snapshot chain broken by expanding the base VHD.
VhdTool.exe /repair base.vhd base_EF2F9402-E85B-402F-A979-631CB287C2C4.avhd
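Not part of the quoted docs, but as a follow-up: on Windows 7 / Server 2008 R2 or newer, the freshly created VHD can be attached straight away with diskpart, for example:

diskpart
DISKPART> select vdisk file="c:\Program Files\MyApp\foo.vhd"
DISKPART> attach vdisk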

original post

Myth or Fact: Virtualization Increases the Speed of Delivering IT Services

While the delivery of virtual machines is indisputably faster than deploying physical machines, it is often assumed that this also streamlines the process of deploying IT Services (applications). Virtualization can be used as an accelerator for building out highly dynamic cloud based services, however, the fact remains that while provisioning and deploying VMs has been greatly simplified, application deployments are still complicated and dependent on People, Processes and Technology.

Sure, in a perfect world where organizations only run one application on one type of OS, and one version of that application – virtualization can certainly automate the entire process of application delivery.  QA and development environments have been realizing this benefit for years.  However, what happens when organizations begin to accelerate their adoption and begin migrating mission critical, multi-tier applications to the virtual data center?

Unfortunately, we live in an IT service world of ever-increasing complexity, requiring in-depth knowledge of application deployment, application dependencies, and application monitoring, plus a deep understanding of security and regulatory requirements. Combine this with the myth that applications deployed in the virtual data center can be delivered faster, cheaper, and easier, and it is no surprise that some operations teams are struggling to meet these expectations. How many times has the underlying virtual infrastructure (VMs) been provisioned in minutes, while the actual delivery of services takes hours, days, or in certain cases, weeks?

To the surprise of many, the deployment of IT services in a virtual world still requires the coordination and collaboration of teams spanning operations, security, storage, network, server and applications. At this point, I must ask: does virtualization add to the complexity of delivering applications by frequently adding another isolated ‘team’ to IT? If we measure solely the time it takes to deliver the IT service, shouldn’t we also measure how well the teams collaborate across the silos that are required to deliver those services?

Perhaps virtualization is making us virtually blind to the reality that legacy processes, technology, and approaches are inadequate for delivering IT services in the cloud. Unless the virtual data center becomes the catalyst for the convergence and alignment of security, operations, application, server, storage, and network teams towards common ‘goals and SLAs’, we will continue to live the reality that delivering cloud-ready services requires a little more work than just ‘right click – deploy’.

Original post here

NexentaStor ZFS-based SAN: console and web GUI errors

I constantly have problems accessing the NexentaStor console and web GUI.

NMV trouble:

After a long timeout I see:

Proxy Error

The proxy server received an invalid response from an upstream server. The proxy server could not handle the request GET /data/services/.

Reason: Error reading from remote server

Apache/2.2.8 Ubuntu DAV/2 mod_ssl/2.2.8 OpenSSL/0.9.8k Server at x.x.x.x Port 2000

After restarting the NMS service (svcadm restart nms) I can now access the web GUI, but still not through Internet Explorer.

Wondering what the issue could be, I added the host URL to the trusted sites list, but I am still unable to access the web GUI via IE.

Chrome, Firefox & Opera rock…
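For reference, the SMF commands on the appliance console (svcadm restart nms is from above; svcs -xv is the standard Solaris way to spot services stuck in maintenance):

svcs -xv             # show degraded/maintenance services and the reason
svcadm restart nms   # restart the Nexenta Management Server behind the web GUI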

At times I pat myself on the back… great, buddy, weren’t you thinking of putting the SAN (ZFS-based NexentaStor) straight into production after test-running it at home?

Phew… it’s better to get your hands burnt a few times, tame the technology, and get to know the pros and cons of anything new before adopting it for production…