James Farrar
Uber drivers in Europe and the U.S. are fighting for access to their personal data. Whoever wins the lawsuit could get to reframe the terms of the gig economy.

Over two years of driving for Uber, James Farrar logged thousands of miles on the app. Many weeks, he’d work more than 80 hours behind the wheel of his Ford Mondeo, crisscrossing the streets of London deep into the night. Along with passengers, Farrar was collecting data.

During his time in the car, Uber’s app recorded where he went, how long he stayed, how much money he made, and how many stars he was given by his passengers. It noted how many rides he accepted and how many he cancelled, mapped where trips started and ended, and how long it took him to wind through traffic to get there as he followed the algorithmic cues nudging him around the city.

Being an Uber driver, Farrar found, did not agree with him: “[L]ife behind the wheel,” as he wrote in a recent op-ed for the U.K. Independent, “can become a blur of endless traffic, crushing loneliness, enduring fatigue and relationships strained by absence.” And he grew frustrated by the way the app always seemed to be pushing him to accept more rides, while his earnings kept declining. In 2016, he and another London Uber driver, Yaseen Aslam, brought a workers’ rights claim against the company, arguing that drivers aren’t true independent contractors and should instead be classified under the U.K.’s third employment category of “worker”—entitling them to minimum wage and paid vacation days. Farrar’s team won the classification case.

But during the trials, Uber was able to use Farrar’s personal data as legal ammunition against him, he said; the company argued that the reason he made under minimum wage some days was because he’d declined several rides, not because he was being fleeced by the app. “I decided then that I needed to see all my data,” Farrar told CityLab in an email. “[S]o I could properly assert my rights and eliminate the asymmetry in information power between me and Uber.”

Uber appealed, arguing, as it long has, that it merely connects independent entrepreneurs with riders, and that a change in classification would impede drivers’ freedom. (A judge said that the wording of Uber’s contract, which makes the same claim, contained a “high degree of fiction.”) Still, one judge out of three backed the company, and Uber was given permission to bump the trial up to the U.K.’s Supreme Court.

With the support of Ravi Naik, the lawyer who is also representing plaintiffs in data privacy cases against Facebook and Cambridge Analytica, Farrar and three other drivers have bundled their data requests, eking out more information with each challenge. This March, they filed a lawsuit against Uber for withholding some data, which they say is in breach of the European Union’s General Data Protection Regulation (GDPR). This law gives E.U. citizens the right to request any and all personal data that a platform retains about them.

Though he says he’s not “anti-Uber,” Farrar’s labor rights activism has accelerated: In 2017, he helped found and became the chairman of the United Private Hire Drivers branch of the IWGB union, which represents more than a thousand workers for private hire companies. And as of this summer, he has pushed more than 60 other drivers to file similar data claims. Now he’s pooling their information online as part of an organization he founded and directs called the Worker Info Exchange. Since Uber arbitrates cases from all its worldwide markets besides North America in Amsterdam, in the E.U. member state of the Netherlands, these GDPR-based claims could be replicated in dozens of countries.

With that aggregate, he wants to determine definitively how much (or little) drivers make for their time—and how an over-supply of temporary drivers has saturated the market with idling cars. “The negative effects of these apps are congestion and poverty, and we need the data to show that,” Farrar said.

This tension isn’t an entirely new one. Mounting evidence suggests that all kinds of apps, from the Weather Channel to the selfie-filtering Perfect365, have habitually scraped location data from phones and used it to better predict consumer buying habits. And Uber’s tight grip on information is part of a long pattern for the ride-hailing titan, which has often tangled with local regulators over its vast trove of trip data. But Farrar believes that whoever gains access to Uber’s complete data caches will find something bigger than just tools for better traffic…
Alex Brandon/AP
A new study claims the effects of neighborhood change on original lower-income residents are largely positive, despite fears of spiking rents and displacement.

Anyone can see the changes at work in a gentrifying neighborhood. Rents rise, crime drops, wine bars bloom in vacant storefronts. But it’s harder to see what’s happening inside people’s homes and lives. For all the handwringing that accompanies gentrification—from how common it is to the meaning of the word to what people should do to stop it—there are rarely robust efforts to tease out the impact of a neighborhood’s economic upswing on its original residents. Few resources exist to show how change really affects residents, for good or bad.

The study looks at original residents of low-income, central-city neighborhoods of the 100 largest metro areas using census data from 2000 and American Community Survey data from 2010 to 2014. Using the earlier data as a base, researchers Quentin Brummet and Davin Reed tracked changes in educational achievement and household status among less-educated renters and homeowners as well as more-educated renters and homeowners. While some of these neighborhoods saw gentrification, not all did, providing a basis for comparison.

For less-educated renters, who are among a neighborhood’s more vulnerable demographic groups, gentrification increases out-migration by 6 percentage points. Migration among renters is high whether a neighborhood becomes fancy or not: The research finds that 68 percent of less-educated renters and 79 percent of more-educated renters move over the course of a decade. So, on average, gentrification spurs around 10 percent of moves for less-educated renters (and much less so for renters with more education).
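
(A rough reading of the arithmetic behind that "around 10 percent" figure follows; how the 6-point effect relates to the 68 percent baseline is our assumption, not a calculation taken from the paper.)

```python
# Figures as cited above (less-educated renters, over a decade).
effect_pp = 6      # extra out-migration attributed to gentrification, in percentage points
moved_pct = 68     # share of less-educated renters who moved overall

# If the 6-point effect is folded into the 68 percent who moved, gentrification
# accounts for roughly 6/68 of observed moves; measured against the ~62 percent
# counterfactual, it is roughly 6/62. Either way, on the order of one move in ten.
print(round(effect_pp / moved_pct, 3))                 # ~0.088
print(round(effect_pp / (moved_pct - effect_pp), 3))   # ~0.097
```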

Given the high baseline rate of turnover within neighborhoods, the data suggest that gentrification’s role as a direct cause of displacement is easily overstated. “This effectively places a limit on the potential for gentrification to cause displacement and makes it possible for neighborhoods to change quickly even without strong displacement effects,” the paper reads.

No doubt, there are unobservable costs associated with moving, which the paper acknowledges. Moving is pretty awful under the best of circumstances, and “displacement” usually summons a worst-case scenario. But leaving a neighborhood can lead to a perfectly neutral outcome. The research shows that “for all types of individuals, movers from gentrifying neighborhoods do not experience worse changes in observable outcomes than movers from nongentrifying neighborhoods.” The paper continues, “That is, they are not more likely to end up in a higher-poverty neighborhood, to become unemployed, or to commute farther than individuals moving from nongentrifying neighborhoods.”

For those original renters and homeowners who stick around, the benefits of improving neighborhood conditions are several. Gentrification reduces the exposure of original residents to poverty, which is tied especially to healthier outcomes for children. For less-educated renters, gentrification appears to be entirely responsible for the reduced exposure to poverty: The baseline change in poverty exposure within this group was zero.
Alicja Biała, Iwo Borkowicz and Dominik Pazdzior
Designer Alicja Biała and architect Iwo Borkowicz have aimed to capture the realities of climate change with these colourful Totemy towers that serve as multi-storey data visualisations.

Installed beneath MVRDV's Bałtyk tower in Poznań, Poland, each of the six Totemy sculptures is a nine-metre-tall, geometric wooden tower.

Each of the totems has been designed to communicate a statistic about an environmental issue. For instance, one totem illustrates what has happened to every piece of plastic produced throughout history.

The sculpture is dominated by its blue top half, carved into bold geometric shapes and faintly patterned with swirls. This represents all the plastic that has been discarded as waste.

Below it, slimmer sections in different colours show the fates of the remaining plastic. Green shows it is still in use; red, that it has been burnt. The slimmest section, a mere belt of yellow, represents plastic that has been recycled.

Viewers can access these explanations — as well as links to the statistics' sources — by scanning a QR code on each sculpture.

Biała said she hopes the installation, which will remain at the site permanently, will help to inject climate change into people's conversation.

"We wanted to address the public at large, and at an everyday level," she said. "Passersby on the street and tram will catch out of the corner of their eye a flash of strong colours and be reminded of the current state of our world."

She has been buoyed by the positive response, both locally and abroad, since Totemy opened on 16 May.

"This is particularly important within the state of discourse in Poland where many politicians and public figures manifest climate ignorance, like Polish President Andrzej Duda, who has a rich climate change denial history," said Biała.

"The thing is that our totems are designed to represent science; you may discuss with me, but you cannot argue with facts."
NYSE courtesy of Parsons
The market on May 8 welcomed an initial public offering by Parsons Corp.—detailed in an April 29 U.S. Securities and Exchange Commission filing—snapping up 10.9 million shares by mid-morning, pushing the stock price up to $30.28, well above the opening bell price of $27.

The company planned to offer 18.5 million shares to the marketplace in a transaction valued at $500 million. The company valuation was listed at $3.2 billion.

Uber's highly publicized IPO, executed on May 9, was valued at $8.1 billion, below the $10.4 billion figure that had been touted, a more cautious pricing in the wake of rival Lyft's 20% share drop since its March public debut. Uber is still the largest U.S. IPO since 2014. The deals are just two in what analysts said was the IPO market’s most active week in the last four years, with at least 15 deals set.

Parsons stock has been owned since 1984 by its estimated 15,000 employees through an employee stock ownership plan. Industry market watchers are optimistic about the outcome for the firm. The IPO would offer about 20% of total shares.

"Parsons had been public from 1959 to 1964 ,,, and always thought we'd go public again some day. We've done a lot of growth though [mergers and acquisitions], and this has helped us continue that path," Chairman Charles Harrington said in a CNBC interview, noting that new "dry powder" gained in the IPO will fuel additional purchases.

It has been expanding U.S. government cybersecurity, intelligence and mission critical work, which observers say is a more stable and less risky arena. "When you're protecting the government's infrastructure and intelligence and missile defense, we think those are pretty secure parts of the defense spend," Harrington said.

The firm recently moved its headquarters to a Washington, D.C., suburb from Pasadena, Calif., to be closer to that client base.

Parsons, which lists its stock on the New York Stock Exchange under the symbol PSN, will use the cash to fuel growth and pay down an estimated $300-million debt.

In January, Parsons acquired OGSystems, a consultant specializing in geospatial intelligence, big data analytics and threat mitigation primarily for government agencies, the firm’s third purchase in that sector in 14 months.

Online financial site Benzinga cites Parsons’ total 2018 revenue at $3.56 billion, with its federal unit comprising 41.5% of that total and critical infrastructure 58.5%. The site says the former unit grew 37% last year, while the latter grew just 7.5%. The firm also saw its backlog rise 24.1% year-over-year in 2018 to $8 billion.

Harrington said the firm could also gain in its work in complex airport and intelligent highway systems if new infrastructure spending is passed.

“As time goes on, the convergence of technology and ‘traditional’ engineering will continue to blur the lines of an engineering service provider,” says Michael O’Brien, partner at industry financial consultant Rusk O’Brien Gido. “With an aging demographic, repurchase obligations for closely-held companies have become a big concern for managing cash flow.”
Civil + Structural Engineer
Data Center Powerhouse ScaleMatrix has a Message for the AEC Industry: Bring it On.

Foreseeing the time when AEC firms will face data management issues caused by the mainstream implementation of AI and machine learning, California-based ScaleMatrix says it will be ready.

Mark Ortenzi and Chris Orlando, the high-performing masterminds and co-founders of ScaleMatrix, have invented a hybrid air/liquid cooled cabinet built to house virtually any hardware needed for an organization’s computing needs. With built-in logic, the cabinets are efficient, high-density, closed-loop, and fully modular. And compared to the installation of a traditional data center, ScaleMatrix can reduce the deployment time by as much as 75 percent, a deployment that is measured in days, not months or even years. If this cabinet is the meteorite, the old data center systems are the dinosaurs.

The ScaleMatrix cabinet has the ability to scale from 1kW to 52kW of workload, and it can handle anything an AEC firm can produce, especially as the industry has yet to employ AI and other cognitive technologies on a meaningful scale. However, with AI technology expected to boom in the coming years, that will probably change as engineering firms follow the lead of more progressive segments of the economy.

In a nutshell, data growth leads to compute and density increases – more processors – which leads to more power outputs, and thus increased heat, which leads to heightened cooling requirements. In the old days, the raised floor, the wind tunnel, and the chilled room were sufficient. Ortenzi and Orlando know all about it, because it was in the data center industry where they cut their teeth and made their names. But even as they flourished in that industry, they also saw the need for disruption.

“I wanted to invent a better mousetrap,” Ortenzi said.

Or, as Orlando likes to say, “If you want a cold beer, you don’t put it into a cold room. You put it in the refrigerator.”

With important partnerships with leading companies like Hewlett Packard Enterprise and NVIDIA – ScaleMatrix is a select partner in NVIDIA’s DGX-ready data center program – and now with data centers in San Diego, Seattle, Houston, Charlotte, and Jacksonville, ScaleMatrix upped the ante with the recent acquisition of Instant Data Centers, a deal that adds ruggedized, micro-data centers that can function on the edge – near the action and in remote locations, like a mine.

Even though the technology behind what ScaleMatrix does is perhaps dizzying, the philosophy is quite simple.

“Everything we do in this business is power and cooling,” Ortenzi said. “Next to labor, power is the biggest expense. It takes so many amps to cool so many amps. It takes so many watts to cool so many watts.”
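
One rough way to put numbers on that point is the industry's standard power usage effectiveness (PUE) metric, the ratio of total facility power to IT power. The sketch below is illustrative only; the PUE values are assumptions, not ScaleMatrix figures.

```python
def facility_power(it_load_kw: float, pue: float) -> tuple[float, float]:
    """Total facility power and the cooling/overhead share implied by a PUE.

    PUE = total facility power / IT equipment power, so everything above 1.0
    is cooling, power distribution, and other overhead.
    """
    total = it_load_kw * pue
    overhead = total - it_load_kw
    return total, overhead

# Illustrative only: a 52 kW cabinet (the top of the range quoted above) at two
# assumed PUEs, a dated raised-floor room vs. a tight closed-loop design.
for pue in (1.8, 1.15):
    total, overhead = facility_power(52, pue)
    print(f"PUE {pue}: {total:.1f} kW total, {overhead:.1f} kW of cooling and overhead")
```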

The cabinets have built-in logic that responds to usage requirements, making the variable system “one big, breathing animal that modulates based on requirements,” Ortenzi said. The ScaleMatrix design includes full cooling support, redundant power supply, fire suppression, and integrated network support. When one cabinet gets filled up, just add another one. While ScaleMatrix at first offered cloud and colocation services, it has since added another distinct business line, the DDC™ cabinet for companies that want them for their own data centers.

While the reaction from the market has certainly been favorable – ScaleMatrix had 2018 combined sales of about $20 million and employs 52 people – it wasn’t necessarily instant and overwhelming.

“That’s a great novelty, but who needs that?” Ortenzi said, referring to the initial reaction he and Orlando got when they introduced a system that could handle such a heavy workload.

But all that changed about two years ago, when AI and machine learning came in from the fringe and entered the mainstream. Seemingly overnight, companies were dealing with more data than ever, and ScaleMatrix started fielding calls from all across the country, and even the world.

“All of a sudden, two years ago, all hell breaks loose and no one knew what to do,” Ortenzi said. “We’ve set ourselves up to be in a position to help people. Where else are they going to go?”
DamienGeso/iStock, David von Diemar/Unsplash
In a broad new set of sustainability commitments, the company wants to use its tech to develop tools to monitor and find insights in environmental data.

In 2012, before declaring your company “carbon neutral” was de rigueur, Microsoft committed to that standard across its operations. Since then, Microsoft has continued to take steps toward cleaning up its own act, purchasing enough green power to equal its electricity consumption, investing in reforestation projects, and setting the target of reducing its emissions 75% by 2030.

Even though Microsoft has worked diligently to advance sustainable practices, its approach, says Lucas Joppa, the company’s chief environmental officer, has remained fairly internal. “We’ve been so focused on reducing the environmental footprint of our own operations–that was really the traditional focus,” Joppa says. Now, the company feels that it’s time to expand its approach. Through a new set of sustainability commitments, Microsoft wants to turn its sustainability efforts outward, by making its artificial intelligence and tech tools more widely available for use in environmental research, and by pursuing new research and advocacy efforts in the environmental field.

“The reason we’re doing this is almost perfectly correlated with impatience,” Joppa says. “The reality shows that no matter how successful we are, sustainability actions inside of our own four walls are entirely insufficient for moving the world toward an environmentally sustainable future.” The same logic applies across the corporate world: No matter how much an individual company works to achieve personal sustainability goals, it’s not going to create the kind of large-scale change we need to combat climate change.

Microsoft’s plan is to turn what it does well–technology and AI–outward to support climate action. It will aggregate and host environmental data sets on its cloud platform, Azure, and make them publicly available (it’s also using AI to make its Azure data centers run more efficiently). Those data sets, according to Microsoft, are too large for researchers to use without advanced cloud computing, and hosting them on Azure should ease that issue.

The company will also scale up the work it does with other nonprofits and companies tackling environmental issues through a data lens. Microsoft has already worked in concert with the water management company Ecolab to develop a tool to assess and monetize a company’s water usage, and how much they would save–both in financial and environmental terms–by driving down their consumption and waste. They’ll also work with The Yield, a company that uses sensors to assess weather and conditions for farmers, to improve the operations of their tools and equip them with AI that will help them predict weather patterns and soil conditions in advance. And they’re equipping SilviaTerra, a startup that uses AI to monitor global forest populations, with the tools it needs to store and analyze vast amounts of data.

Alongside these partnerships, Microsoft is also working to prove that these types of data-driven projects can deliver enormous benefits to both the environment and the economy. Through research conducted with PwC, Microsoft looked at how AI could be applied across four sectors with implications for the planet: agriculture, water, energy, and transportation. “Even just for a few different sectors, and a few different levers in those sectors, a rapid adoption of AI-based technology has the potential to not only make significant gains for the environment, but also for the GDP overall,” Joppa says. Microsoft found that advancing AI usage across those four sectors could boost global GDP by as much as 4.4% by 2030, and reduce greenhouse gas emissions by around 4% in the same time period. “We need to get past the idea that acting on climate will slow economic growth,” Joppa says.
Carrier Johnson
Aside from equipment innovations, the building industry has remained largely unchanged for the last 100 years. That began to change about 10 years ago with the advent of building information modeling (BIM) software, said Daniel Reeves, president of the San Diego–based community and government affairs consultancy Juniper Strategic Advisory, who served as moderator of a ULI San Diego/Tijuana event in March.

As in other business sectors, innovative technology is having a disruptive impact on building construction, operations, and management, according to event presenters, who discussed new technology used to cut time for project due diligence; make cost estimates more accurate and construction more precise; improve building operations and efficiency; and enhance tenant engagement, comfort, and satisfaction.

Scoutred

San Diego–based Scoutred offers software that simplifies and speeds up early-stage due diligence for real estate developers and their associates, reducing time for research from days to a few minutes, said founder Alexander Rolek. He explained that Scoutred organizes property information on millions of parcels in San Diego County and visualizes the data in a report designed to help parties make informed decisions.

Simply put in the parcel address and receive an immediate report that details property and zoning information, including subdivision name, parcel size, legal description, the owner’s name and address, tax assessment, map location, use type, building height limits, floor/area ratio (FAR), and setbacks. This information is exported to a PDF format, which also includes the following: a high-resolution aerial photo; zoning and other applicable overlays, such as parking and mass transit; a description of the community plan; details on the property’s attributes and any improvements; and all permits pulled on the property on record with the city.
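
As a hypothetical sketch of the kind of structured record such a report could be generated from (the field names and values below are illustrative, not Scoutred's actual schema):

```python
from dataclasses import dataclass

@dataclass
class ParcelReport:
    """Hypothetical shape of an early-due-diligence parcel record."""
    address: str
    subdivision: str
    parcel_size_sqft: float
    legal_description: str
    owner_name: str
    tax_assessment_usd: float
    use_type: str
    zoning: str
    height_limit_ft: float
    far: float                      # floor/area ratio
    setbacks_ft: dict[str, float]   # e.g. {"front": 15, "rear": 10}
    permits: list[str]              # permits on record with the city

# Invented example values, for illustration only.
report = ParcelReport(
    address="123 Example St, San Diego, CA",
    subdivision="Example Heights",
    parcel_size_sqft=6500,
    legal_description="LOT 4 BLK 2 ...",
    owner_name="(owner of record)",
    tax_assessment_usd=415_000,
    use_type="residential",
    zoning="RM-1-1",
    height_limit_ft=30,
    far=0.9,
    setbacks_ft={"front": 15, "rear": 13, "side": 4},
    permits=["Building Permit 2016-001234"],
)
print(report.zoning, report.far)
```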

Willow

Based in Sydney, Australia, Willow, a global software developer, is focused on creating easy-to-use systems that facilitate smart building construction, optimize building performance, enhance user experience, and open new streams of revenue by turning data into value.

The company has partnered with Microsoft to create Willow Twin, a scalable platform that leverages the power of the internet of things (IoT) and artificial intelligence to create 2-D or 3-D digital, geometrically accurate replicas of real estate assets that contain all asset information and live operational data.

“Data is the new gold,” said Casey Mahon, digital coordinator, Willow North America, explaining that the program collects building data and uses them to transform a structure into a living, evolving asset that learns from experience. The program harnesses building data, tracking user behavior and building performance to improve the tenant experience and drive savings through actionable insights and predictive maintenance.

Five years ago, Willow partnered with Investa, one of Australia’s largest developers, owners, and managers of commercial real estate, to develop Willow Twin 2.0, an intelligent digital twin that integrates 3-D visualization with data to allow a building to learn to operate itself efficiently.

Over time, the system learns to effectively manage energy and other resources used by assets: using data analytics and intuitive reporting, it improves assets’ triple bottom line by increasing their cash value while reducing their impact on the environment.

Since then, Willow and Investa have used Willow Digital and Willow Twin to innovate multiple areas of building development and operations, ranging from complex digital design and construction management to use of intelligent digital twins to manage buildings efficiently and enhance the tenant experience.

Mahon noted that Willow Digital 2.0 identifies which assets to track long-term, and lessons learned can be applied across an entire portfolio using Willow Scan, a QR-code-driven solution designed to identify and manage all assets in a portfolio.

It also provides a completion tracker and model auditor that validates subcontractor data, including operations and maintenance manuals, asset registers, and warranty information, and gathers and stores operational manuals for the building or infrastructure network, which Mahon stresses is especially important when handing off building management…
HED
In a deal that clearly points to an important growth sector in the AEC industry, Harley Ellis Devereaux (HED) entered the data center design market by merging with Integrated Design Group, an architecture, engineering, and planning firm with an established reputation in the field.

The leadership and staff of Integrated Design, also known as ID, from its offices in Boston and Dallas will join the HED team with locations in Chicago, Detroit, Los Angeles, San Diego, San Francisco, and Sacramento. The combined firm is now 420 strong.

Zweig Group, an AEC management advisory firm with offices in Dallas, Fayetteville, Houston, Salt Lake City, and Washington, D.C., represented the selling firm.

“The merger between HED and ID is a great match,” said Phil Keil, Zweig Group’s director of strategy consulting and the firm’s project manager on the deal. “ID’s thought leadership in the data center design space will be a valuable growth platform as HED continues to serve clients, especially in healthcare and higher education. This merger with ID also opens up a regional presence for HED in the important and growing markets of Boston and Dallas.”

According to Peter Devereaux, FAIA, chairman of HED, this is a natural step for the firm.

“We are committed to strategic growth that increases the firm’s ability to create positive impacts for our clients and their stakeholders,” he said. “Bringing the ID team into the HED family is a step on our journey toward expanding our expertise and enabling a greater impact for our clients. It also allows us to reach new audiences – both in this new market sector for HED and in all the sectors we serve in the regions surrounding Boston and Dallas.”

HED leadership recognizes that this is an important, fast-growing sector throughout the U.S. and beyond, and sees this as a golden opportunity to get into the right sector at the right time.

“Many of our clients, in healthcare, higher education, and corporate work, for example, are seeking this intelligence and specialized expertise,” Devereaux said. “This is an example of our ability to bring additional resources and insight to the table for our clients.”
UNStudio has unveiled plans to create the "smartest neighbourhood in the world" in the Netherlands, where residents produce their own resources and control the use of their data.

Brainport Smart District (BSD) will be located in Helmond, a city in the south of the Netherlands, with the construction of 1,500 new homes over the next 10 years.

Amsterdam-based UNStudio is planning the circular and "socially cohesive" development in collaboration with its sister company, the tech startup UNSense. They envision the area as a "living laboratory" that produces its own food and energy, manages its own waste and controls its own data.

They describe it as "a unique Dutch initiative for future living".

The project includes a section called 100 Houses – a live experiment where UNSense will explore the ethics of data as currency.

Seen as a direct response to the monopoly on data held by big technology companies, 100 Houses will offer its residents – who will be drawn from across Dutch society – full control of their data.

"Residents and communities benefit from their own data"

"The current digital business model is extremely lucrative for tech giants, but is far less advantageous for the majority of local companies, organisations and individuals," said UNStudio.

100 Houses would, alternatively, operate on "the principle that the residents and communities benefit from their own data".

This test environment would serve as a prototype for an alternative economic model in which users could save money via digital amenities including food and transport, while contributing to the growth of local communities and businesses.

A board of ethics could offer objective advice on issues such as data ownership, privacy and commercial gain for the end user. UNSense is working with Tilburg University on the scheme, with the aim of eventually scaling the model if it is successful.

Garrett Rowland
The next generation of intelligent buildings offers promise for unseen levels of energy efficiency, optimization, and occupant health and productivity.

Buoyed by a surge of high-tech innovations and several years of robust U.S. construction markets, AEC teams are working on ideas for “smart buildings.” Since the mid-1980s, a new generation of products, technologies, and analytical tools has transformed the building landscape. The benefits of “smart” technologies and operations for design, construction, and ownership/operations are now inescapable.

Prior to the 1990s, the notion of intelligent buildings focused on controls and automated processes for building operations, mainly in HVAC, lighting, and security systems, says Joachim Schuessler, Principal with Goettsch Partners. “Then, about 15 to 20 years ago, we started working on buildings that optimized controllability and comfort for the users,” he says. By the late 1990s, tools like building information modeling were making built projects a digital extension of the architectural/engineering and fabrication processes, with valuable impacts on downstream operations such as facility management.

The latest definitions of smart buildings embrace a much broader, more futuristic outlook. Schuessler and other experts describe the new paradigm as buildings and building portfolios created and operated using technology systems that aggregate data, make decisions, and continuously optimize operations with ongoing predictive feedback, including from building systems and occupants.

David Herd, Managing Partner with BuroHappold Engineering, asks: “Do the building’s design and systems anticipate programmatic change over time? Is it a ‘well’ building that helps keep people healthy? If it’s smart, today’s thinking goes, it can accomplish these goals, and more.”

Tech-enabled properties transcend time and place, too. “Smart buildings can also be defined as connected buildings,” says Marco Macagnano, PhD, Senior Manager, Lead: Smart Real Estate with Deloitte Consulting. They are “the product of an omni-channel approach focused on generating meaningful information to support decision making through data analysis.”

Connected systems should add practical value while protecting against hackers and other breaches. They can benefit O&M by tracking energy-use intensity (EUI) across multiple campuses or by alerting a facilities department that an escalator is in jeopardy of failing. Owners can use the cloud and the Internet to access existing systems to do more. Bring in the ability of Big Data to tap into worldwide reporting on facility operations, and building owners can suddenly identify patterns and trends that could lead to better design choices.
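
Energy-use intensity itself is a simple ratio, annual energy consumed divided by gross floor area, which is what makes it easy to compare across a portfolio. Below is a minimal sketch of that kind of cross-campus comparison, with invented numbers.

```python
# Illustrative portfolio data: annual energy use (kBtu) and gross floor area (sq ft).
campuses = {
    "Campus A": {"energy_kbtu": 9_200_000, "area_sqft": 120_000},
    "Campus B": {"energy_kbtu": 4_100_000, "area_sqft": 80_000},
    "Campus C": {"energy_kbtu": 15_500_000, "area_sqft": 150_000},
}

# EUI = annual energy use / gross floor area (kBtu per sq ft per year).
euis = {name: c["energy_kbtu"] / c["area_sqft"] for name, c in campuses.items()}

# Flag campuses well above the portfolio average for a closer look.
avg = sum(euis.values()) / len(euis)
for name, eui in sorted(euis.items(), key=lambda kv: kv[1], reverse=True):
    flag = "  <-- investigate" if eui > 1.25 * avg else ""
    print(f"{name}: {eui:.0f} kBtu/sq ft/yr{flag}")
```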

“The biggest difference with current smart buildings is that tech is the enabler of three primary pillars: sustainability and carbon neutrality, the well-being of users, and user-centered design,” says Jan-Hein Lakeman, Executive Managing Director of Edge Technologies and OVG Real Estate USA.
John Durant
As data evolves to information and then knowledge, so will architecture.

Charlie Williams, AIA, is associate principal and project delivery director with the California-based integrated design firm LPA and 2019 chair of AIA’s Technology in Architectural Practice (TAP) Knowledge Community. Williams is also responsible for Inspire Design, the LPA technology team dedicated to “inspiring the design process through technology.” He is passionate about helping LPA leverage its informed process through data-driven design.

I’ve come to learn that successful technology implementation has a lot to do with change management. When I first stepped into this role, I thought it would be mostly about getting the right tools installed and explaining why they’re valuable. But I realized that’s only 10 percent of the job; the other 90 percent is understanding people and how they respond to change. As a result, I try to incorporate any excitement around new technologies while focusing most of my energy on the “people” side of things.

As an integrated design firm, LPA aims to incorporate input from each discipline at the earliest stages of a project, and then refine our insights to arrive at a solution that is highly informed by research and data. This new process is moving us away from the idea of intuition leading our design efforts. This doesn’t mean you change gears entirely; architects and engineers have been operating on intuition and professional experience for quite a while. But it’s time to meld intuition, experience, and data-driven design thinking.

Data, in conjunction with new technologies, will change how we design our buildings and how we interact with our clients. The greatest advancements in this area can come from anywhere; they won’t be constrained to academia or larger firms. TAP’s Building Connections Congress, which I am chairing in 2019, will also focus on this idea by bringing speakers together to discuss how the evolution of “data to information to knowledge” will advance practice.

If you look at the people who’ve worked passionately with data and found success in transforming sports or business, many were not initially industry leaders. Consider Michael Lewis’ book Moneyball. The Oakland A’s were the least likely team to transform baseball—or all of sports for that matter—at the turn of the century, but they did just that in 2002 through data-driven decision-making. I don’t think the next big thing necessarily comes from a large firm…
Sidewalk Labs
Google affiliate Sidewalk Labs has prompted a backlash over data privacy with its Quayside smart-city project. Alex Bozikovic reports on what's been overlooked amid the controversy.

This smart city of the future first appeared in cutesy sketches. Drawn in a cheerful palette were a kayaker paddling in a harbor, a dad pulling a little one in a bike trailer, children running hand-in-hand through a carless streetscape. There were gondolas and pergolas, and underground robots carrying waste. And, vaguely, in the background, there were also buildings.

This was the vision for Quayside, a new waterfront neighborhood in Toronto conceived by “Sidewalk Toronto,” a partnership between a local public agency and Sidewalk Labs, a New York–based unit of Alphabet, Google’s parent company. “By leveraging technology and combining it with really smart, people-centric urban planning,” Sidewalk Labs CEO Dan Doctoroff said at the time, “we could have really dramatic impacts on quality of life.”

Sidewalk Toronto was launched in October 2017. A year and a few months later, the vision for Quayside remains only slightly less vague than those initial drawings. The 3 million-square-foot project promises to include many of the hallmarks of smart-city ventures: “dynamic streets” designed for autonomous vehicles, “radical-mixed-use” buildings featuring “power-over-Ethernet,” and a novel approach to retail and service space that prioritizes pop-ups over long-term leases. The project also promises to inspire meaningful innovations in construction and real estate practice. “We’re putting forward new technologies that have not been integrated before,” says Karim Khalifa, a mechanical engineer who is the director of buildings innovation for Sidewalk Labs. “The project includes prefabricated mass timber at a scale that has never been attempted.”

Perhaps most importantly, Quayside promises to generate endless streams of data—from buildings, road sensors, traffic signals, and other sources—with the promise that they will make the development more efficient, safe, and pleasant. Local resistance to the plan has mounted, however, as residents of various political stripes have raised a provocative series of questions. Who will control that data? What does a tech-inspired, Google-affiliated city mean, technologically, socially, economically, and politically? What, exactly, is Sidewalk trying to build?

An Instigator, Not a Developer
Quayside is the first major project by Sidewalk Labs—a showpiece that the company hopes will define its reputation in the field of “urban innovation.”
Rudin Management Company
It’s common to associate the term “big data” with targeted advertising and invasions of privacy, but for New York City-based commercial real estate management company Rudin, the massive sets of numbers can help save the world.

In response to requests from utility companies and government entities to reduce emissions and prevent catastrophes like the northeast blackout in the summer of 2003, the firm launched a tech startup, Prescriptive Data, to find a way to use big data sets, often siloed in separate systems, to answer these calls.

The startup combines data from Rudin’s buildings with numbers from utility companies into an operating system they call Nantum, which regulates heat, elevators and other property machinery. The results, Rudin found, include more comfortable conditions for its tenants, along with a massive reduction in carbon emissions. At the end of 2018, the 19 properties using Nantum together achieved a 44% reduction in emissions, more than half of the 80%-by-2050 goal established under the New York City Carbon Challenge.

However, since data collection is still a relatively new science, and a confusing one for many, I asked Rudin’s executive vice president and COO John Gilbert to explain how Nantum used it to achieve the recent sustainability milestone.

How did you get the idea for a data tech startup?

In 2009, President Obama put together a stimulus package following the recession of 2008, and a chunk of that money was to go to utilities to create some innate intelligence within the grid. It was really to prevent the big blackout in August of 2003, when 55 million people in the northeast and southern Canada lost power, from ever happening again. We were reached out to by Con Edison and they said, “Listen, we got this chunk of [money] and we want to talk to some smart customers of ours. You, Rudin, have had a history of integrating very innovative technology within your portfolio, will you act as the smart technology petri dish for us?” And we said sure.

Champs-sur-Marne
In 2016, Ecole des Ponts ParisTech established an advanced masters program with a focus on digital fabrication and robotics. Currently recruiting for its fourth installment, the Design by Data Advanced Masters Program appeals to architects, engineers, and tech-oriented designers. Since its launch in 2016, the program’s director Francesco Cingolani has sought to shape the relationship between architecture and technology by creating a cross-disciplinary culture between the two.

As previously mentioned on ArchDaily, students study the main components of the program - computational design, digital culture and design, and additive manufacturing and robotic fabrication - throughout the 12-month program to fulfill Design by Data’s main objectives while working with peers in a dynamic learning environment. While providing each participant with both technical skills and an aesthetic eye, the program ensures students will also gain critical knowledge of current innovative trends and ongoing research. By exposing them to technology through hands-on use of digital fabrication tools, the program teaches students to approach design through a process-oriented lens.

"Computational design, to me, is a completely new way of thinking about architecture and design that merges digital arts and engineering. Computational design is mainly about how we can use algorithms, mathematics, and generative thinking to create a novelty of architecture and object with complex geometries that are not standard.” --Francesco Cingolani, co-founder the Design by Data Program.

The program created a Makerspace, an interdisciplinary learning platform for prototyping. Makerspace "fosters interdisciplinarity between the various fields of expertise represented in the school and in neighboring schools." According to Ecole des Ponts ParisTech, students develop transferable skills in various disciplines by making and coding. They view the "Designer-Builder" in this environment "as a strategist, capable of conceiving and leading new methodologies for problem-solving."
Lake|Flato Architects and Matsys Design
The start of a new year may be the impetus you needed to reassess professional priorities and investments for the near and long term. Continuing our annual tradition of identifying technologies and business strategies to set you and your firm up for success, ARCHITECT asked nine digital leaders in tech-forward practices across the country to identify the changes they anticipate in the design profession. To gauge whether they're walking the walk, we also asked them to summarize what they've resolved to accomplish in 2019, with (imaginary) bonus points for brevity.

Map the Invisible and the Invaluable

Fred Perpall, FAIA
Chief executive officer, The Beck Group, Dallas


Prediction: In 2019, we’ll see firms learn how to use data more effectively. It is a challenge to accurately predict the length of time a task takes, yet every day we take leaps of faith and sign ourselves up for projects that may be impossible to achieve. It is no secret that our industry has been historically slow to adopt technology and tech-driven practices. Often, we rely on gut instinct rather than using readily available data to make informed decisions to improve projects. Some of this is a cultural problem, but it is also a challenge to understand what to do with data.

At Beck, we’ve developed in-house applications that use historical data and BIM to advise clients on what is possible when it comes to a project’s design, cost, and duration with a high surety factor. We recently used this practice during the planning phase of a project with a large university and determined that the proposed deadline did not allow enough time to complete the project. Because we were in planning, we course-corrected the schedule before time and money became an issue, all thanks to data and our technology.
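
The general idea, stripped to its simplest form, is fitting historical project data and using the fit to sanity-check a new schedule. The sketch below is a toy illustration with invented numbers, not Beck's in-house tooling.

```python
# Hypothetical historical projects: (size in thousand sq ft, duration in months).
history = [(50, 14), (120, 22), (200, 30), (320, 41), (450, 52)]

# Ordinary least-squares fit of duration against size (plain Python, no libraries).
n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x, _ in history))
intercept = mean_y - slope * mean_x

# Estimate a new 280k sq ft project and compare against a proposed deadline.
estimate = intercept + slope * 280
print(f"Estimated duration: {estimate:.0f} months")  # flag if the client's deadline is shorter
```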

Resolution: We’re putting prefabrication practices into our design workflow. We recently established a facility and are scaling towards prefabricated solutions.

Andrea Love, AIA
Principal and director of building science, Payette, Boston


Prediction: Architects are on the precipice of embracing big data and leveraging it to inform design. I see this more and more across a number of aspects of architectural design, whether it be programmatic or performance data. In my world of building science, we are at this interesting juncture in computing abilities and tool availability where you can now, say, set up an energy model…
The ever-increasing number of connected devices and the need to store all the resulting data has led to a rise in U.S. data center construction that shows no signs of slowing down.

Apple recently announced that it would invest $10 billion in data centers in the next five years — a move that puts it at the forefront of investment in such projects but certainly not on its own. In fact, according to a recent Insight Partners report, the value of the international data center construction market was $43.7 billion in 2017, and, at a compound annual growth rate of 10.2%, is expected to be $92.9 billion in 2025.
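
Those two endpoints are roughly consistent with the quoted growth rate; a quick check of the implied compound annual growth rate (our arithmetic, not the report's):

```python
# Market size in $ billions, 2017 -> 2025, per the figures cited above.
start, end, years = 43.7, 92.9, 8
implied_cagr = (end / start) ** (1 / years) - 1
print(f"{implied_cagr:.1%}")   # ~9.9%, in line with the cited 10.2% CAGR
```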

The data centers being built fall mostly into two categories — those that will serve only the needs of huge, data-heavy companies like Google and Facebook and those being built for colocation firms like CyrusOne. Both types of centers include those classified as “hyperscale,” which are made flexible enough through energy efficiencies and other optimizations to be able to expand and meet demands of big data, cloud computing and enterprise customers.

So while Amazon and high-tech cohorts like Apple build these facilities for their own exclusive use, colocation data centers provide server space, bandwidth and equipment for multiple companies under one roof.

According to Pat Lynch, senior managing director for CBRE's data center solutions division, there are relatively few locations in competition for one of these facilities. There aren’t many areas of the country, he said, that can offer the winning combination of low power and land costs, plus the incentives — i.e. property tax breaks and other perks — that attract big players in the space.

Facebook, for example, is building a $750 million data center in Huntsville, Alabama, and will receive a $6.6 million incentive package from local agencies in return, including $2 million in waived permit fees and $4.6 million of infrastructure work around the new data center.

Another key to successful recruitment of a data center, he said, is a solid partnership with the local power company, which must be prepared to make a major investment in infrastructure to meet future energy needs.
McKinsey
How do the best design performers increase their revenues and shareholder returns at nearly twice the rate of their industry counterparts?

We all know examples of bad product and service design. The USB plug (always lucky on the third try). The experience of rushing to make your connecting flight at many airports. The exhaust port on the Death Star in Star Wars.

We also all know iconic designs, such as the Swiss Army Knife, the humble Google home page, or the Disneyland visitor experience. All of these are constant reminders of the way strong design can be at the heart of both disruptive and sustained commercial success in physical, service, and digital settings.

Despite the obvious commercial benefits of designing great products and services, consistently realizing this goal is notoriously hard—and getting harder. Only the very best designs now stand out from the crowd, given the rapid rise in consumer expectations driven by the likes of Amazon; instant access to global information and reviews; and the blurring of lines between hardware, software, and services. Companies need stronger design capabilities than ever before.

So how do companies deliver exceptional designs, launch after launch? What is design worth? To answer these questions, we have conducted what we believe to be (at the time of writing) the most extensive and rigorous research undertaken anywhere to study the design actions that leaders can make to unlock business value. Our intent was to build upon, and strengthen, previous studies and indices, such as those from the Design Management Institute.
tarras79/iStock
A recent study of how employees communicate in open plan offices seemed to be the final nail in the coffin of this popular workplace design. But the study had an essential flaw, writes architect Ashley L. Dunn.

Recently, a team of Harvard researchers set out to determine whether open plan offices help employees interact with each other. Open plan offices have come under intense scrutiny, as studies link the design to poor acoustics and diminished employee performance, but companies continue to build them, because they’re cost-effective, and they’re believed to foster communication and collaboration among employees. The Harvard study wasn’t exactly encouraging, at least not at first glance: It concluded that wall-free offices encourage workers to talk less and email more. Collaboration appeared to be reduced in the subjects’ workplaces.

The study attracted a lot of attention–including in this publication–in part because it was the first to objectively measure, through microphones and electronic badges, how workers communicate in an open plan design. Yet the study had an essential flaw: the extreme open plan offices it examined.

The study only tested how much collaboration happens in poorly designed and extreme open plan offices with absolutely no walls or partitions. But the vast majority of offices used by leading organizations that are categorized as “open plan” have collaboration spaces and dedicated areas for private conversation.

Extreme open offices–work areas without the choice of meeting rooms, breakout spaces, or telephone booths–are unlikely to appeal to business managers who want to keep employees happy. Furthermore, a designer well-versed in workplace strategy would never suggest such an environment for a client.

Every office needs places where employees feel free to talk–which typically requires some element of privacy. As we’ve found in our line of work, designing offices for various organizations and Fortune 500 companies, a thoughtful mix of open and closed spaces is key to any successful office design.

Individual spaces need to be assessed case by case and not lumped into a “big data” mind-set that rewards quantitative measures over qualitative ones. There’s a range of potential workspace designs between traditional offices and the totally open plan.