Wednesday, 29 April 2009

Hackintosh @ Acer Aspire 4920

A few days ago, I tried to install OSX86 on my beloved Acer Aspire 4920 laptop. Impossible, you say? In fact, it is possible. Here are the specifications of my Acer Aspire:

* Intel ® Centrino ® Duo mobile processor technology, featuring Intel ® Core 2 Duo Processor T5550 (1.83GHz, 667FSB, 2MB L2 Cache), Intel ® GM965 Express chipset
* 3GB DDR II RAM
* Intel ® GM965 Express Chipset with integrated 3D graphics, featuring Intel ® Graphic Media Accelerator (GMA) X3100 with up to 358 MB of Intel ® Dynamic Video Memory Technology 4.0
* Intel ® Wireless WiFi Link 4965AGN (dual-band quad-mode 802.11a/b/g/Draft-N) network connection, supporting Acer SignalUp with InviLink™ Nplify™ wireless technology

I used iATKOS, version 5i (10.5.5). I had once tried to install version 4i, but that ended in failure. :( There are some other distributions you could also try, such as XxX, iDeneb, Leo4All, etc. By customizing the installation, I finally got the Hackintosh working.

Turning a Dell Mini Notebook into a Hackintosh

If Apple's newest notebooks are priced too high, consumers may be interested in the price a Hackintosh offers. Uneasy Silence reports successfully installing Leopard on the Dell Mini 9, a mini laptop priced at USD 450. According to Uneasy Silence, you still have to pay USD 129 for a Leopard license when installing it on the Dell, but this is not a big problem.

With a Hackintosh, a user can install Leopard on non-Apple hardware. The OSx86 Project wiki offers a compatibility list where users can find out which hardware has been tested and works with Apple software. Installing Leopard on a Dell, Vaio, Eee PC, or other Windows laptop is not easy, so make sure the laptop is compatible with Leopard first. Before installing, at least keep a WinXP backup disk on hand, in case you want to restore the original Windows system.

A Hackintosh notebook also has other drawbacks: it cannot be upgraded the normal way, and there is no guarantee of hardware compatibility, so things like sound playback and wireless connections may make life a little harder. If you just want to play around with Leopard or another installation, you can try a Hackintosh. However, if you want a reliable OS X, buy an Apple notebook. Want to try it on the Inspiron Mini 9? You can use the instructions from the OSx86 Wiki along with those reported by Uneasy Silence.

Monday, 27 April 2009

Leo4all V3 on the GA-M55SLI S4

Specifications used:

Motherboard: GA-M55SLI S4
Memory: Patriot 667 MHz
Hard disk: Western Digital 160 GB
VGA: PowerColor ATI Radeon HD3650
Kernel: 9.2.2 by Modbin

Working drivers: VGA, LAN (Realtek PHY 81116), sound (ALC850), sleep
Problems: 1. At boot, type cpus=1 (see the sketch below); otherwise, after a few minutes the mouse will start to move slowly and the audio will develop noise.
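
For anyone unfamiliar with boot flags, here is a minimal sketch of how this looks in practice, assuming the standard Darwin bootloader that these distributions install: press a key when the boot prompt appears, then type the flag, optionally adding -v for verbose output.

    boot: cpus=1 -v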

An Introduction to Virtualization

Introduction

Over the course of the past couple of months, the AnandTech IT team has been putting a lot of time into mapping out and clarifying the inner workings of several interesting forms of virtualization. The purpose of these articles was not primarily to cover "news" (however current the topic might be), but to create a sort of knowledge base on this subject at AnandTech, as we find many people interested in learning more about virtualization are met with a large amount of misinformation. Second, we believe that better knowledge of the subject will empower the people in control of their company's IT infrastructure and help them make the right decisions for the job.

There's no denying that virtualization is changing companies' server rooms all over the world. It promotes both innovation and the preservation of aged applications that would otherwise not survive a migration to modern hardware platforms. Virtualization is completely redefining the rules of what can and cannot be done in a server environment, adding a versatility that increases with every new release. We believe that when making big changes to any existing system, the more information available to the people given that task, the better.

But what about desktop users?

While the above should make it clear why businesses are extremely interested in this technology, and why we have been digging so deep into its inner workings, we are also noticing an increased interest from the rest of our reader base, and have been flooded with questions about the how and why of all these different kinds of virtualization. Since we don't want to leave any interested people out in the cold, and the in-depth articles may seem a bit daunting to anyone looking to get an introduction, here is another article in our "virtualization series". In it, we will attempt to guide our readers through the different technologies and their actual uses, along with some interesting tidbits for regular desktop users.

"New" Virtualization vs. "Old" Virtualization

The recent buzz around the word "virtualization" may give anyone the impression that it is something relatively new. Nothing is further from the truth, however: virtualization has been an integral part of server and personal computing almost from the very beginning. Using the single term "virtualization" for each of its countless branches and offshoot technologies ends up being quite confusing, so we'll try to shed some light on them.

How to Define Virtualization

To define it in a general sense, we could state that virtualization encompasses any technology - either software or hardware - that adds an extra layer of isolation or extra flexibility to a standard system. Typically, virtualization increases the number of steps a job takes to complete, but the slowdown is made up for by increased simplicity or flexibility in the affected part of the system. To clarify: the overall system complexity increases, which in turn makes manipulating certain subsystems a lot easier. In many cases, virtualization has been implemented to make a software developer's job a lot less aggravating.

Most modern-day software has become dependent on this, making use of virtual memory for vastly simplified memory management, virtual disks to allow for partitioning and RAID arrays, and sometimes even pre-installed "virtual machines" (think of Java and .NET) for better software portability. In a sense, the entire point of an operating system is to give software foolproof use of the computer's hardware, taking control of almost every bit of communication with the actual machinery in an attempt to reduce complexity and increase stability for the software itself.
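
To make the virtual memory example concrete, here is a minimal C sketch of our own (an illustration, not something from the original article): each process is handed its own private virtual address space, so two copies of this program can refer to the "same" address without ever touching the same physical memory.

    #include <stdio.h>

    int main(void) {
        int x = 42;
        /* &x is a virtual address. The OS and the MMU translate it to a
           physical address behind the scenes, so two running copies of
           this program may print identical addresses while using
           physically different memory. (With address space layout
           randomization, the value may also vary between runs - but it
           is still virtual, never physical.) */
        printf("address of x: %p\n", (void *)&x);
        return 0;
    }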

So if this is the general gist behind virtualization (and we can tell you it has been around for almost 50 years), what is this recent surge in popularity all about?

Baby Steps Leading to World-Class Innovations

Many "little" problems have called for companies like VMware and Microsoft to develop software throughout the years. As technology progresses, several hardware types become defunct and are no longer manufactured or supported. This is true for all hardware classes, from server systems to those old yet glorious video game systems that are collecting dust in the attic. Even though a certain architecture is abandoned by its manufacturers, existing software may still be of great (or perhaps sentimental) value to its owners. For that reason alone, virtualization software is used to emulate the abandoned architecture on a completely different type of machine.

A fairly recent example of this (besides the obvious video game system emulators) is found integrated into Apple's OS X: Rosetta. Using a form of real-time binary translation, it is able to change the behavior of applications written for the PowerPC architecture to match that of an x86 application. This allows a large amount of software that would otherwise have to be recompiled to survive an otherwise impossible change in hardware platforms, at the cost of some performance.
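
As a rough, hypothetical sketch of the underlying idea (a simple interpreter loop, far cruder than Rosetta's cached block translation), imagine guest instructions being fetched, decoded, and dispatched to equivalent host operations at run time:

    #include <stdio.h>

    /* Toy "guest" instruction set: a one-register accumulator machine.
       Purely illustrative; unrelated to real PowerPC code. */
    enum guest_op { G_LOAD, G_ADD, G_PRINT, G_HALT };
    struct guest_insn { enum guest_op op; int operand; };

    int main(void) {
        /* Hypothetical guest program: load 2, add 3, print the result. */
        struct guest_insn program[] = {
            { G_LOAD, 2 }, { G_ADD, 3 }, { G_PRINT, 0 }, { G_HALT, 0 }
        };
        int acc = 0; /* the guest's register lives in a host variable */

        for (int pc = 0; ; pc++) {
            struct guest_insn i = program[pc]; /* fetch */
            switch (i.op) {                    /* decode, execute on host */
            case G_LOAD:  acc = i.operand;     break;
            case G_ADD:   acc += i.operand;    break;
            case G_PRINT: printf("%d\n", acc); break;
            case G_HALT:  return 0;
            }
        }
    }

A real translator recovers most of the lost performance by translating whole blocks of guest code into native host code once and caching the result, instead of re-decoding every instruction each time it runs.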

Hardware platforms have not been the only things to change, however. Changes in both desktop and server operating systems might force a company to run older versions of an OS (or even a completely different one) to keep software with compatibility issues usable. Likewise, developers need securely isolated environments in which to test their software, without having to compromise their own systems.

The market met these demands with products like Microsoft's Virtual PC and VMware Workstation. Generally, these solutions offer no emulation of a defunct platform, but rather an isolated environment of the same architecture as the host system. However, exceptions do exist (Virtual PC for the Mac OS emulated the x86 architecture on a PowerPC CPU, allowing virtual machines to run Windows).
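
For readers who want to experiment with this kind of isolated environment at no cost, the open source emulator QEMU (our suggestion, not a product named above) behaves much like the products mentioned. As a sketch, with disk.img and install.iso as placeholder file names, creating and booting a virtual machine looks roughly like this; depending on the version, the binary may be called qemu or qemu-system-x86_64:

    qemu-img create disk.img 8G
    qemu -m 512 -hda disk.img -cdrom install.iso -boot d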

Putting the results of these methods together has led to a solution for a problem quietly growing in many a company's server room. While the development of faster and more reliable hardware was kicked up a notch, a lot of the actual server software lagged behind, unable to make proper use of the enormous quantity of resources suddenly available to it. Companies were left with either irreplaceable but badly aging hardware, or brand new servers suffering from very inefficient resource usage.

A new question emerged: Would it be possible to consolidate multiple servers onto a single powerful hardware system? The industry's collective answer: "Yes it is, and ours is the best way to do it."