
What Is a Private Equity Firm?

A private equity firm is an investment company that raises funds and uses them to help businesses grow by purchasing ownership stakes. This is different from individual investors who buy stock in publicly traded firms: shareholders may receive dividends, but they have no direct control over a company’s operations and decisions. Private equity firms invest in groups of companies, referred to as portfolios, and seek to control the businesses they acquire.

They will often find a business that has room for improvement and buy it, making adjustments to increase efficiency, reduce costs, and help the business grow. Private equity firms might use borrowed money to buy and take over businesses; this is referred to as a leveraged buyout. They then sell the company for a profit, and in the meantime collect management fees from the companies in their portfolio.

This cycle of buying, reworking, and then selling can be lengthy, especially for smaller businesses. Many companies therefore look for alternative methods of financing that give them access to working capital without the added management fees of a PE firm.

Private equity firms have fought against stereotypes portraying them as asset strippers by highlighting their management expertise and successful transformations of portfolio companies. But some critics, including U.S. Senator Elizabeth Warren, argue that private equity’s pursuit of quick profits destroys workers’ long-term prospects and undermines their rights.