PRESS


TECH TALKS: Artificial intelligence created filter bubbles. Now it’s helping to fight it.


What would happen if a news publisher printed a different copy of its newspaper for each of its subscribers? That is a hypothetical question that would have sounded ridiculous two decades ago, when print newspapers and broadcasters were still the prevalent channels for receiving news.

But those days are slowly coming to an end thanks to an explosion in artificial intelligence. Today, more and more people consume news, articles and other information through online curators like Facebook and Google News. These platforms use AI algorithms to create digital profiles of users and deliver content that conforms to their preferences.

No two users see the same news feed in their social media accounts and their news curation apps. AI-driven news curation makes sure you see “relevant” results, as the curators will tell you. But in effect, these algorithms submerge you in a filter bubble, where you see too much of what you want to see and too little of what you should see.

The direct result of filter bubbles is polarization of societies, amplification of biases, less tolerance for opposing views, and more vulnerability to fake news. We’ve already seen these phenomena create or intensify crises during elections, political rivalries, and sectarian violence in different regions of the world.

There’s now increasing speculation on different regulatory, social and technological solutions to rein in the control of algorithmic curators. And there are various efforts being led to define the ethical boundaries of AI.

Interestingly, one of the solutions to overcome our biases is artificial intelligence, the same technology that contributed immensely to the problem in the first place.



How AI creates filter bubbles

Before we lay all the blame on Facebook, Google, and other organizations that use AI algorithms to curate our news, we must understand that these organizations are addressing a real problem. However, they might not be the best party to solve it.

Unlike the age of print newspapers, when publication came with a hefty price tag, in the age of online content anyone can set up a blog and start posting to their heart’s content without spending a dime.

The democratization of publishing has created an explosion of content. In the U.S. alone, there are more than 30 million blogs. Every day, more than 4 billion pieces of content are shared on Facebook and 500 million tweets posted on Twitter. Google News tracks thousands of websites ranging from blogs (like this site) to huge publications such as The New York Times. Even Apple’s news subscription service, which is being criticized for not having a lot of news, tracks 300 major publications.

How can users find their way through this sea of information? “We’re seeing a lot of content being published every day, and there’s no way for consumers to see or read all of it. That’s why online platforms and news providers use algorithms to distill it in some way,” says Tania Ahuja, founder and CEO of Nobias, a New York–based startup that uses AI to track political bias in online content.

Machine learning algorithms, a popular subset of artificial intelligence, are especially good at finding patterns and correlations across large data sets. News curators and social media companies use machine learning to detect the common characteristics of the kind of content users interact with (clicks, likes, retweets, etc.). The AI algorithms then use this information to recommend other content that has similar characteristics, or to find users who engage with similar content.

Even when filtered down to the preferences of the individual user, there’s still a lot of online content to choose from. That’s why machine learning algorithms usually prioritize content based on user engagement. The more users interact with a specific piece of content, the likelier it is to appear in other users’ news feeds.
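A minimal sketch can make this engagement-driven ranking concrete. The scoring weights and field names below are illustrative assumptions, not any platform’s actual formula:

```python
# Sketch of engagement-based feed ranking. The weights are
# illustrative assumptions, not any platform's real formula.

def engagement_score(item):
    # Weight stronger signals (shares) above weaker ones (clicks).
    return 1.0 * item["clicks"] + 2.0 * item["likes"] + 3.0 * item["shares"]

def rank_feed(items):
    """Return items with the most-engaged content first."""
    return sorted(items, key=engagement_score, reverse=True)

feed = [
    {"title": "Policy analysis", "clicks": 120, "likes": 30, "shares": 5},
    {"title": "Sensational clickbait", "clicks": 900, "likes": 400, "shares": 250},
    {"title": "Local news", "clicks": 60, "likes": 10, "shares": 2},
]

for item in rank_feed(feed):
    print(item["title"], engagement_score(item))
# "Sensational clickbait" ranks first by a wide margin.
```

Note the feedback loop this creates: whatever ranks highest collects still more engagement, which raises its score further on the next pass.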

At first glance, this seems like a win-win proposal. Instead of wasting precious time scrolling through their news feeds or adjusting their search queries to find useful news and information, users let an AI algorithm take care of the task for them and show them “relevant” results. This means they spend less time searching and more time reading and watching content. As for the companies that run these AI algorithms, they get to keep users glued to their apps and monetize their attention by showing them ads.

But in reality, it’s a zero-sum game. And the user is always the loser.

As Ahuja points out, these AI algorithms create several critical problems. “They reward junky clickbait because the users click on articles that have sensational titles,” she says.

During the 2016 U.S. presidential election, a group of Macedonian youths created a misinformation crisis by posting stories with sensational headlines about things such as the criminal indictment of Hillary Clinton or the pope’s endorsement of Trump, neither of which ever happened. When shared on Facebook, the stories drew many clicks from intrigued users. The high level of engagement caused Facebook’s AI algorithms to recommend the stories to other users, which generated even more clicks, and so on.

In the same year, high user engagement caused Facebook’s algorithms to push a false story about the firing of Fox News anchor Megyn Kelly into the platform’s “Trending Topics,” showing the fake news to millions of users.

But fake news is just part of the problem, Ahuja points out. AI-based personalization causes another problem at the individual level, one that is more hidden and perhaps more dangerous. “Ultimately you get wrapped in a filter bubble. For instance, in the case of political bias, if you’re always reading articles that are left-leaning, you will end up with a news feed that is primarily left-slanted, and publications with the other point of view will get buried in your news feed,” she says.

Like many others, Ahuja is worried that as AI personalizes content, it will influence people’s perceptions of really important topics like politics and business, leading to fractured and polarized societies and making it easy for bad actors to manipulate people.

The most well-known case of manipulation is Facebook’s notorious Cambridge Analytica scandal, in which a political data firm tried to leverage the company’s vast store of personal information to influence the behavior of millions of U.S. voters. Evaluating the effectiveness of such campaigns is very difficult, but Cambridge Analytica shed light on the dangers of the granularity of control that AI algorithms provide over the content that users see.

Privacy and ethical controversies over the use of artificial intelligence to curate online content have raised awareness of the issue and spurred the companies using these algorithms to alter their practices. Lawmakers in various regions are also weighing regulations that would rein in tech companies’ practices of collecting and mining user data with AI algorithms.

But the process is too slow, Ahuja believes. “It’s not going to come soon, and also, to some extent, the business model of a lot of these platforms are predicated on free access to their platforms, and is financed by digital ads, which in turn is driven by engagement and sharing rather than the actual value of the news,” she says. “I don’t think the tech companies that run these platforms should be deciding what we see.”

In short, it’s hard to trust companies that built economic empires on monetizing user attention to undercut themselves.

Tracking bias in your news feed

“While I’m not saying AI itself is causing these problems, it’s just that the nature of this onslaught of content, and AI being used to help with it, is actually really not helping because people can’t fully appreciate what is really going on,” Ahuja says.

Ahuja believes it is the civic duty of every person to make sure they read news from a variety of sources and keep themselves informed on all sides of a story. But what tech companies like Nobias can do is help users identify the slant of the news they read and discover the biases they might be unintentionally developing.

To do this, Nobias has developed what Ahuja calls a “Fitbit for news,” a free browser extension that reports on the political bias of online articles and keeps track of the general political standing of the kind of content users consume.

“We created Nobias as a tool to help customers take back control of their news feed. We let them know if they’re only reading one side of a story. They might choose to do so—it’s their prerogative. But at least for those who would want to have access to both sides of information, then we give them this information right in their news feed so they can be selective of what they read and develop their own unique point of view,” Ahuja says.

After installing Nobias, when you browse to news feeds such as Google News, Facebook, and the home pages of popular news sites, the extension will decorate the page with colored pawprints that indicate the political slant of each article. Blue pawprints stand for liberal content, red for conservative, and purple for center. There’s also a gray color that indicates Nobias could not determine the slant.

Nobias tracks bias at the article level. So, as the above picture shows, some of the articles on the homepage of The New York Times, clearly a left-leaning publication, are marked as conservative.

Nobias also provides information on the credibility of the source of an article and the level of readability. Inside articles, the tool will provide analysis of the bias of outbound link sources where it can.

At the user level, Nobias will provide you with a detailed summary of the credibility and bias of the news you consume (of course, as long as you’re reading it in your Chrome browser). This can help you find out if you have a balanced diet of online content or you’re too biased toward a specific point of view.

 
 

Nobias won’t change your news feed. But it will give you a clearer picture of the content you’re consuming online and whether your news feed is showing you everything you need to know.

“It’s important to make people aware that there are AI algorithms at work and that they need to be mindful of their interactions. We want to act as an independent party that helps people get a healthy feed of well-rounded news,” Ahuja says. “We feel that more informed people are less likely to fall for fake news. Fake news works because it targets people who have very little information and don’t have the big picture.”

Using machine learning to fight filter bubbles

Interestingly, the technology at the heart of Nobias is machine learning, the same technique that has had an important role in amplifying biases and creating filter bubbles.

Nobias has published a detailed account of the methodology it uses to grade the bias in news stories and articles. “For the bias algorithm, what we did was we tried to be as objective as possible. We used published methodology by Matthew Gentzkow and Jesse Shapiro in Econometrica, a top economics journal, in 2010 and 2016,” Ahuja says.

The company used records of speeches in Congress as training data for the AI algorithm that detects slant and political bias in online content. “In Congress, you know if somebody is Democrat or Republican. It’s a lot more objective,” Ahuja says.

After being trained on the text of the speeches, the machine learning algorithm develops a model of the kind of keywords that draw the line between left- and right-leaning speech. It then uses these correlations to classify new content. The AI algorithm uses thresholds and multiple levels of bias (far-right and -left, center-right and -left, and center) to reduce false positives.
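As an illustration of the thresholding step, a raw slant score could be bucketed into the five levels like this. The score scale and cutoff values are invented for the sketch; Nobias’s actual thresholds are not public:

```python
# Sketch: bucket a raw slant score into five bias levels.
# Negative scores lean left, positive lean right.
# The cutoffs (0.2 and 0.6) are invented for illustration.

def bias_level(score, inner=0.2, outer=0.6):
    if score <= -outer:
        return "far-left"
    if score <= -inner:
        return "center-left"
    if score < inner:
        return "center"
    if score < outer:
        return "center-right"
    return "far-right"

print(bias_level(-0.8))  # far-left
print(bias_level(0.1))   # center
print(bias_level(0.4))   # center-right
```

Keeping a wide “center” band is one way such a classifier can avoid flagging mildly tinted articles as partisan, which is the false-positive reduction the article describes.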

If users don’t agree or find an error in the analysis of a specific article, they can send feedback to Nobias. The company uses this crowdsourced information to further tune the AI algorithm and make it more accurate. “We update the machine learning algorithm every month, but if we receive a lot of feedback on something that is being discussed a lot, then we will update it more quickly,” Ahuja says.

One of the things that worried me was that bad actors might try to trick or game the AI algorithm by carefully wording their content in ways that might go unnoticed by human readers but have a totally different meaning to the machine learning model.

“Honestly, we’re not large enough to become a worry yet,” Ahuja says, adding that, “What we do is update the machine learning model monthly, so it will be hard for malicious actors to constantly game the AI in their favor.”

Ahuja also says Nobias will not publish the list of keywords that define the classes of political slant, which will make it harder for malicious actors to attack the AI models.


You’re responsible for your own biases

Experts and visionaries suggest that in the age of machine learning and automation, one thing that can protect us against the adverse effects of AI algorithms is to use more AI.

Ahuja makes no such claim with Nobias. “We want to make it easier for people to discover their own biases. That’s why we view this more as a productivity tool. You know that you need to have a balanced diet. This just helps you to get there without much work,” she says.

What this means is that it’s still our own responsibility to step out of our filter bubbles, go out of our way to investigate the different facets of a story, learn to listen to and tolerate opposing views, and try to find common grounds and compromises in debates.

Those are hard human choices that no AI will make for you.


This article has previously been published on Tech Talks.


DAILY DOT: Can a browser extension combat political bias in your news diet?


How balanced is the content you read online? How much do you respect opposing views? These questions are becoming more and more important as the news we consume online becomes more and more personalized by algorithms.

In the past few years, there’s been increasing focus on the adverse effects of filter bubbles. Platforms such as Facebook, Twitter, and Google use AI algorithms to tailor their content to show us things that confirm our preferences. While this gives us a false sense of satisfaction and keeps us glued to the applications of these companies, it also amplifies our biases and makes us less tolerant of things that don’t confirm our views.

New York-based startup Nobias believes the first step toward solving the negative effects of content curation algorithms is to educate users about their biases and help them maintain a balanced diet of news content. And, ironically, it’s using AI algorithms to address the problem.

An explosion of online content

“We’re seeing a lot of content being published every day, and there’s no way for consumers to be able to see or read all of it. That’s why online platforms and news providers use algorithms to distill it in some way,” says Tania Ahuja, founder and CEO of Nobias.

These algorithms fill users’ news feeds with content that they’re more likely to engage with. The more they click on the items they like, the more their content becomes personalized to their tastes.

“Ultimately you get wrapped in a filter bubble. For instance, in the case of political bias, if you’re always reading articles that are left-leaning, you will end up with a news feed that is primarily left-slanted, and publications with the other point of view will get buried in your news feed,” Ahuja says, pointing out that these algorithms are influencing people’s perceptions on really important topics like politics and business, creating a fractured society with hyper-polarized groups.

“This makes it easy for bad actors to manipulate people. This is really causing a problem in the society,” Ahuja says.

While acknowledging that in the past few years, increased awareness and activism has forced platforms such as Google and Facebook to update their algorithms to address concerns about consumer privacy, fake news, echo chambers, manipulation, and bias, Ahuja believes that we’re nowhere near solving the problem in a fundamental way.

“The process is very slow, and also, to some extent, the business model of a lot of these platforms are predicated on free access to their platforms, and is financed by digital ads, which in turn is driven by engagement and sharing rather than the actual value of the news. I don’t think the tech companies that run these platforms should be deciding what we see,” she says.

The Nobias browser extension

Nobias was created as a free tool to help users take back control of their news feed, Ahuja explains. The point is not to curate content for users, but to warn them if the news they’re reading does not represent all viewpoints on a topic.

“We let them know if they’re only reading one side of a story. They might choose to do so—it’s their prerogative. But at least for those who would want to have access to both sides of information, then we give them this information right in their news feed so they can be selective of what they read and become aware of what other people are sharing and reading so they can develop their own unique point of view,” Ahuja says.

The tool, which Ahuja describes as a “Fitbit for news,” is a Chrome extension.

Once installed, Nobias will decorate news pages with colored paw icons that show the political slant of the featured articles. The paws score news items between left (blue) and right (red), with an unbiased center (purple) standing in the middle. There’s also an unknown (grey) paw for articles Nobias can’t classify. When hovering over the link to each article, Nobias will give you a quick summary of its political slant, the credibility of the news source and the readability of the content.

The tool works on platforms that curate news, such as Google News and Facebook, and tracks popular news sites such as The New York Times, The Washington Post, and Fox News, with more outlets being gradually added to the list.

Nobias works at the article level. This means that aside from the general political alignment of a publication, the tool also analyzes the content of each article and gives an assessment for that specific article. “The articles themselves might have features that are left- or right-leaning. But a lot of articles are also center and you can get factual information from them. Also, some center articles might have both left and right features that cancel each other out,” Ahuja says.

Nobias will also let you track your diet of online content by giving you a summary of the general political bias of the articles you read as well as the credibility of the sources you get your news from.

 
 

Using AI to determine slant and bias

Determining the political slant of an article is not an easy task. To overcome the challenge, the team at Nobias employed a methodology published by Matthew Gentzkow and Jesse Shapiro in Econometrica, a top economics journal, in 2010 and 2016, which uses the congressional record of speeches to set baselines for political alignment. (The full methodology can be found here.)

“In Congress you know if somebody is Democrat or Republican. It’s a lot more objective. We assume Democrats are left-leaning and Republicans are right-leaning,” Ahuja says.

The company used the transcripts of speeches to create an index of keywords that can be considered representative of left and right views. It then used the gathered data to train a machine-learning algorithm that can determine the slant of news articles. Machine learning algorithms are AI programs that analyze large sets of data and find common patterns and correlations between similar samples. They then use these insights to classify new information.

In an oversimplified example, if you train a machine learning algorithm with a large set of speech records, each labeled with its corresponding political standing (left or right), it will find the common traits between each of the classes. Afterwards, when you give the algorithm the text of a new article, it will determine its political slant based on its similarities to the samples it has previously seen.
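A toy version of this idea can be sketched in a few lines. The labeled “speeches” and vocabulary below are invented stand-ins for the congressional record, and the smoothed log-ratio score is just one simple way to measure keyword slant, not the actual Gentzkow–Shapiro index:

```python
# Toy keyword-slant scorer. The labeled "speeches" are invented
# stand-ins for the congressional record, and the smoothed
# log-ratio is just one simple way to score partisan vocabulary.
from collections import Counter
import math

speeches = [
    ("left",  "workers deserve healthcare and climate action"),
    ("left",  "invest in healthcare education and climate"),
    ("right", "cut taxes secure the border defend freedom"),
    ("right", "lower taxes strong border and freedom"),
]

left, right = Counter(), Counter()
for label, text in speeches:
    (left if label == "left" else right).update(text.split())

def slant_score(text):
    """Positive -> right-leaning vocabulary dominates; negative -> left."""
    # Add-one smoothing keeps unseen words from blowing up the ratio.
    return sum(math.log((right[w] + 1) / (left[w] + 1)) for w in text.split())

print(slant_score("cut taxes and secure the border"))  # positive
print(slant_score("healthcare and climate"))           # negative
```

Words that appear equally often on both sides (or not at all) contribute nothing, so the score is driven by distinctly partisan vocabulary, which is the intuition behind training on labeled congressional speeches.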

According to Ahuja, Nobias’ algorithm can determine slant with a 5-10% error rate. If users feel an article has been wrongly classified, they can submit their opinion. The developer team will use the feedback to retune their algorithm.

The civic duty to read balanced news

Nobias is a good example of putting AI to good use to counter some of the negative effects that algorithms themselves have created. However, it’s not a perfect solution.

“We want to make it easier for people to discover their own biases. That’s why we view this more as a productivity tool. You know that you need to have a balanced diet. This just helps you to get there without much work,” Ahuja says.

At the end of the day, a tool like Nobias will help you identify your biases, but it’ll be your own responsibility to overcome them. You’ll have to go out of your way, investigate the news on your own, study the different angles of a story, and learn to read and respect views that oppose yours. That is something no amount of automation can do in your stead.


This article has previously been published on Daily Dot.


FORBES: Veer Into The Other Side: How To Fight Implicit Bias In The Digital Age

The side-view mirror on the passenger side of your vehicle was once an add-on.  But over time, what was once considered a luxury feature became a necessity. As the number of multilane roads and highways grew, it became harder to look around us as we drove. With progress came the need for new technology.

I can think of no better analogy for where we find ourselves today with regard to our online political discourse.

There are people to our left. There are people to our right. But more often than not, we’re cruising down the information superhighway, drafting closely behind whatever car looks the most like ours, seldom aware of our massive blind spots.

Then we’re shocked when a car veers into our lane (or we veer into theirs) and a catastrophic collision of ideas occurs. If only there were some way to know our political blind spots.

Our Implicit Blindness

Here’s the foundational problem: We all have blind spots. Technically, these blind spots are known as implicit bias. Emily Badger at the New York Times defines implicit bias as “the mind’s way of making uncontrolled and automatic associations between two concepts very quickly.”

Like any rule of thumb, this process can both help and hurt us. For instance, implicit bias lets us follow familiar patterns like navigating our morning commutes. But it can also reinforce bad intuitions like negative stereotypes, which can lead to harmful results down the road.

To make matters more frustrating, we’re blind to our blindness. As psychologist Daniel Kahneman wrote in Thinking, Fast and Slow, “We have very little idea of how little we know. We’re not designed to know how little we know.”

The Challenge To Interrupt

So, how can we reveal our blind spots?

1. Train yourself to detect website bias. 

The next time a divisive issue explodes on your social media timeline, consider conducting this “exercise for bias detection.” By collecting screenshots of several news outlets across the political spectrum after news of a notable event breaks and ordering the screenshots from most extreme right to most extreme left, you can gain a better understanding of a website’s implicit bias.

However, this is a time-consuming process. It’s a helpful process for a one-time study on your own, but it’s not feasible for every major news story.

There are also some quicker options. AllSides is a website that reveals the biases of news articles. AllSides and Media Bias/Fact Check crowdsource their bias ratings. Using such sites can save time when you want a broad survey of the coverage of a news story, but it’s important to ascertain the credibility of a particular rating system.

2. Recognize how sites (even 'objective' ones) identify and respond to your biases. 

The effect of an implicit bias goes far beyond which articles you read; with every click, your choices inform which ads you see and what types of articles pop up at the top of your future Google searches.

Rather than working to expand your information horizons or identify questionable content, major sites and search engines function by doubling down on your implicit bias. If you click on three articles about your favorite sports team, you might only see positive coverage of that team in the future.

Don’t think of your internet activity as searching through a library catalog; it's more like having a constant conversation with assistants who only want to give you what you like.

Needless to say, compounding your bias online can be unproductive and even dangerous. So as far as that goes, the best advice is simple: Think before you click.

3. Confront your implicit bias and take control of your information. 

Collectively, we must fight daily against our implicit biases (particularly in today’s political climate).

There are more traditional ways to check our biases: We can stop relying on just one outlet for our news, we can pay more attention to individual authors and we can go out of our way to fact-check our sources. But sometimes -- as with the advent of the side-view mirror -- a new technology can go a long way.

Just as we need multiple tools to keep us as safe as possible in our cars, new add-ons and sites can help us keep our minds on the right track. And when it comes to new tools, artificial intelligence (AI) and machine learning may hold the key.

How AI Could Thwart Our Biases

As Tomas Chamorro-Premuzic argues in a recent Forbes column, “AI has two big advantages over human beings: first, unlike humans, AI can learn to ignore certain things, such as gender (which, in effect, means unlearning certain categories); second, unlike humans, AI doesn’t care about maintaining a positive self-view (and failing to maintain a positive self-view will not make it depressed, for AI has no self-esteem to protect).”

Self-driving cars have no blind spots. Their AI constantly surveils the areas around the car, alerting the passenger to any possible dangers -- and even taking control of the car when necessary without human involvement.

Now it's time to see if AI can also help us address our blind spots online.

To that end, my company created a Chrome extension that places a discreet icon next to Facebook news posts and shares, Google News search results, and links on your favorite news sites. Hovering over the icon reveals the source’s political slant and credibility rating, based on reliable data and a proprietary AI algorithm.

Knowhere is another tech company looking to combat this issue, training AI to rewrite news stories so they are devoid of bias. According to SingularityHub, "The site uses AI to aggregate news from hundreds of sources and create three versions of each story: one skewed to the left, one skewed to the right and one that’s meant to be impartial."

So look over your shoulder and check every mirror. With help from AI tools, we're about to change lanes.


This article has previously been published on Forbes


TECH RADAR: Should you trust the news you're reading? This browser extension can help

How do you know whether you can trust the news you're reading online? Sites like Facebook show us articles based on ones we’ve clicked on before, reinforcing our existing views in a way that makes us feel better about our opinions – regardless of accuracy. Rather than challenging us to see things from a different angle, they push us into a bubble that echoes our own thoughts back at us.

The big news sources are starting to take notice. Microsoft recently added a browser extension called NewsGuard to its mobile web browser, which notifies you if you’re visiting a site that’s known for sub-par reporting, and Facebook employs fact-checkers to warn users about fake stories – but you don’t have to rely on the big players acting responsibly. There are also various tools around that can help you decide whether an article is worth your time reading, and whether you can trust what it tells you.

Nobias is a free extension for Google Chrome that serves a similar purpose, helping you determine whether the article you’re about to read is trustworthy, and telling you if the site has any noticeable political leaning. It can also estimate how long the story will take to read, helping you manage your time.

We are all living in information bubbles, which polarize us against each other and keep us underinformed
— Dr Tania Ahuja, Nobias

“Readers are drowning in so much online information that it is almost impossible to keep track of what is credible and what isn’t,” says Dr Tania Ahuja, CEO of Nobias. “Algorithms create positive feedback loops, only showing us what they think we like. As a result, we are all living in information bubbles, which polarize us against each other and keep us underinformed.”


Ahuja says Nobias’s mission is to protect readers from deceptive or misleading content, help them understand the landscape of media bias, and give them power over the algorithms that shape what they see online.

How it works

Nobias assesses articles for bias from a US perspective, as Ahuja explains: “The Nobias Chrome extension assesses the political slant of the source, drawing from the methodology of Matthew Gentzkow and Jesse Shapiro in Econometrica (2010), which looks at congressional speeches and finds certain keywords said by a Democrat or Republican. Our machine-learning algorithm mines the text to assess a slant towards the left, right, or center.

“Nobias further adds to the source slant, used as a prior, a labeled article corpus from LexisNexis to develop article level slant using widely used machine learning techniques to identify which words are most associated with left or right leaning articles and assign those words as left and right leaning words.”
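One simple way to picture “adding to the source slant, used as a prior” is a weighted blend of the two signals. The score scale and the 30/70 weighting below are illustrative assumptions, not Nobias’s actual model:

```python
# Sketch: blend a publication's prior slant with the slant measured
# from the article's own text. Scores run from -1 (left) to +1 (right).
# The 30/70 weighting is an illustrative assumption.

def combined_slant(source_prior, article_score, prior_weight=0.3):
    return prior_weight * source_prior + (1 - prior_weight) * article_score

# A left-leaning outlet (-0.6) running a right-leaning article (+0.5):
print(combined_slant(-0.6, 0.5))  # slightly right of center (~0.17)
```

Under this sketch, a right-leaning article from a left-leaning outlet still comes out right of center, consistent with the observation that individual articles can run against their publication’s overall slant.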

The extension ranks the credibility of both the publisher and the author. Publishers are given a score between one and five using source rankings from LexisNexis (where ‘one’ is a top national, international or business source, like The Wall Street Journal or The Economist).

Individual authors are assessed based on the credibility of their employer, and whether they have won any prestigious journalism awards (such as a Pulitzer, for example).

Ahuja says the early reaction to Nobias has been very positive, and several users appreciate being able to see articles from a different perspective. One particular user said he didn’t realize how biased his newsfeed had become.


What's next

The team is currently working on bringing Nobias to other browsers, including Firefox, Opera and Safari. It is also planning to add premium features and venture into other industries (such as finance and health news), but Ahuja insists that the core feature will remain free.

“[…]Our customer will always remain our number one priority and the political tools we have developed, as well as the extension’s core functionality, will always be free with no advertisements,” she says.

Tools like Nobias can give you a helpful insight into the kind of news you’re reading, but they’re not a one-stop solution for bias and fake news. We asked Ahuja what else we can be doing to make sure we’re seeing the full picture.

The best way to break out of your filter bubble is to talk with people on different sides of a given issue
— Dr Tania Ahuja, Nobias

“Nobias helps you understand your online choices, but the best way to break out of your filter bubble is to talk with people on different sides of a given issue,” she says. “Using reputable news sources is another way to help ensure you are getting more balanced information about a news item. 

“We are not changing news, nor are we controlling how our customers find it or limiting what they read. We are simply enhancing their online browsing experience by providing insights on content before they consume it—to be aware of their biases and read from a variety of reliable sources, to have a holistic view rather than one contained in a bubble. Our mission is to help our customers be proactive about developing their own informed, unique point of view.”


This article has previously been published on TechRadar


FORBES: 10 Critical Aspects To Consider Before Adopting A New Technology

New developments are always just on the cusp of being released, bringing with them a host of potential changes and improvements to how companies do business. For companies looking to stay ahead of the rest or be the first adopters of potentially game-changing systems, staying up-to-date with what new tech can offer can seem crucial.

Yet even early adopters know that just because technology becomes available does not necessarily mean that it is the right choice for a business: Different systems solve different problems. To help you decide if a new technology is right for your company, 10 members of Forbes Technology Council share some of the crucial aspects you should consider before deciding to invest your time and resources into a new technology. Here's what they said:

1. Seek Greater Clarity

Observability is a crucial aspect for a new technology. It should, in easy-to-use ways, provide greater clarity on performance status, process flow and interface health. Such technologies empower actions and decisions to improve on multiple fronts. A technology that promises to improve outcomes but that operates as a black box is likely a net negative to an organization's overall capability. - Charles Stucki, Bayware

2. Evaluate Options

We consider how the new technology will help us to meet our goals and then we evaluate whether the technology is best for our organization. We then put together the requirements: the specs, a plan, an evaluation and so forth. - Andreas Schneider, EnOcean

3. Look At The Return On Investment

The return on investment is the most crucial thing to look at. Adopting new technology is an arduous process that requires serious time dedication from the development team. Understanding the cost benefits offered by the technology and how it affects the company’s bottom line not only justifies the effort put in but, more importantly, puts concrete numbers behind the return on investment. - Mihir Shinde, B&H Photo

4. Consider Cybersecurity

Cybersecurity and insider threats are the key considerations with every new technology implementation. No less important is the ability to maintain and operate new tech in the environment of corporate America. - Guy Caspi, Deep Instinct

5. Look To Getting Things Done

The function of technology within an organization is to find a smarter way of getting things done. I encourage my team to make time each week to keep abreast of developments so that when a business need arrives, they are equipped to make a real case for a new technology. Innovation should be driven by a use case, not the other way around. - William Francis, ENO8

6. Look At Long-Term Competitive Advantage

With daily advancements in technology, it is difficult to determine the right time to adopt new technology. It is crucial to look beyond the buzzwords and marketing and make sure that the new technology actually adds value both in the near and long-term horizon. It often takes time to incorporate new technology into mature organizations, so keep in mind the long-term competitive advantage of the technology. - Tania Ahuja, Nobias

7. Understand The Customer Better

One of the main reasons to apply or use technology is to understand the customer’s requirements and needs adequately. This helps to identify any gaps in the process that is implemented, which in turn helps with the adoption of the right technology for the right outcome. - Satish Appalakutty, Vistalytics


8. Visualize Transformational Impact

Companies must consider the business model impact of adoption of new tech like cloud, AI, blockchain, etc. While traditional tech benefits like cost savings and productivity can be expected from tech such as AI and virtualization, new tech also stands to fundamentally change sales, marketing, service and operating models, and so must be seen for transformational impact on the company's core business. - Michael Gurau, Kaiser Associates

9. Assess The Culture Of The Vendor

Assuming that you have multiple choices, and all are cost-effective, provide tangible benefits and are well-designed solutions, decide based on the culture of the vendor. Seek a culture of innovation, customer service and continuous improvement. Seek a vendor that rejoices in small wins and has a history of continuous delivery. Find a partner that is vitally interested in your success. - Dave Bellando, 1st Global Research and Consulting

10. Ask Two Questions

We all like shiny new things that promise to help our businesses grow, increase efficiency or make employees happier. To determine if it's right for you, two questions need to be asked. How does this help me protect and grow revenue? How does implementing this tech help me reduce complexity and reduce investment in my technology stack? - Stanley Lowe, Zscaler Inc.


This article has previously been featured on Forbes


ABC 7: Nobias Empowers Users with Info Needed to Better Evaluate News Choices

NEW YORK--(BUSINESS WIRE)--Nobias, a tech startup dedicated to promoting responsible/inclusive technology and countering online manipulation and misinformation, today announced the company, its first products, and its vision for putting the power of informed media consumption in the hands of users.


Nobias’s first product is a Google Chrome extension that shows the bias and credibility of news served up by Google search results and Facebook’s news feed. Users can drill down to evaluate individual articles and authors. Giving context about the news, even before users click, can counter the effects of algorithmic curation that may keep them in a filter bubble. And knowing about the credibility can help people avoid fake news.

The company was founded by CEO Tania Ahuja, a dynamic leader who holds a PhD in finance and is an accomplished business professional with extensive knowledge in data mining and risk management. She is building experienced journalism and AI advisory teams, to help keep Nobias effective and on the cutting edge.

“My main goal for founding Nobias was to promote responsible/inclusive technology that enables users to better control the news they read online,” said Ahuja. “We want to become like a Fitbit for media consumption. It may be impossible to completely eliminate bias, but we aim to equip everyone with tools to increase awareness about the information they consume and give them control over the algorithms that shape what they read and see online.”

Nobias draws from the published methodology of Matthew Gentzkow and Jesse Shapiro in Econometrica (2010), a top economics journal, to determine political bias. This data evaluates the leaning of a news source by looking at key phrases that have been or currently are used by a Democrat or Republican. The phrases determine a lean of left, right, or center.

Nobias also utilizes editorial rating information from LexisNexis to identify credible sources, both at the website level and the author level. Sources are ranked from 1-5; a 1 rating is a top national, international, or business news source, such as the New York Times or Wall Street Journal. Authors are ranked based on the editorial strength of their employer and on industry recognition.
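As described, credibility combines a 1-5 source rank with author-level signals such as employer strength and industry recognition. A hypothetical sketch of how those inputs might be combined (the mapping and weights here are invented for illustration; the article describes the inputs, not an exact formula):

```python
# Hypothetical illustration of combining a LexisNexis-style 1-5 source rank
# (1 = top-tier national/international/business source) with author signals.
# The weights below are invented; Nobias does not publish an exact formula.

def source_credibility(rank: int) -> float:
    """Map a 1-5 source rank (1 = most credible) onto a 0-1 score."""
    if not 1 <= rank <= 5:
        raise ValueError("rank must be between 1 and 5")
    return (5 - rank) / 4

def author_credibility(employer_rank: int, has_major_award: bool) -> float:
    """Author score: mostly the employer's editorial strength, boosted by awards."""
    base = source_credibility(employer_rank)
    return min(1.0, base + (0.2 if has_major_award else 0.0))
```

Under this sketch, a Pulitzer-winning author at a mid-tier outlet could score comparably to a non-award-winning author at a top-tier one, which matches the spirit of ranking authors on both employer and recognition.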

The Chrome extension is currently able to provide information on US political news articles from 40,000 premium and online news archives and business sources. The tool works on Google’s search engine news as well as on Facebook’s news feed. The company plans to eventually expand the extension for financial news, health news and on an international scale.

About Nobias

Nobias was founded in 2018 as a resource dedicated to promoting responsible/inclusive technology to protect consumers from deceptive or misleading content on the internet. To determine the bias and credibility of an article, it combines artificial intelligence algorithms with methodology and editorial ratings by LexisNexis. Nobias’s goal is to help people understand the landscape of media bias and to give them control over the algorithms that shape what they read and see online.

For more information, visit www.nobias.com.


This article has previously been published on ABC 7


SILICON ALLEY: NYC Startup Nobias Helps Users Vet News Choices, Counter Misinformation and get Balanced Media “Diet”

Today, Manhattan-based Nobias announced the company and Chrome browser extension that shines a light on the bias and credibility of online news – at the publication, article and author levels.

We took the opportunity to ask Nobias CEO Tania Ahuja questions about the company’s mission and solution:

What does Nobias do?

Nobias is a free Chrome extension that empowers users with information needed to better evaluate news choices and ensure a more balanced information “diet.” News today is so polarized and fractures people into groups, leaving no room for other perspectives. People tend to lock themselves inside of their own echo chamber when it comes to news, so the dialog often gets lost. We want Nobias to burst filter bubbles, allowing everyone to develop their own unique and balanced point of view.

What was your second choice for a name?

We came up with a couple – Unbiased and Know Your Bias, but eventually ended up on Nobias. It can be split up as “No Bias” and can also be read as “No BS”, meaning you are only reading news that is in fact credible.

Do you have any user stories?

We have interviewed quite a few users and have so far received very positive feedback. One user emphasized frustration with seeing news the way it is, and how helpful it has been to see stories from a different point of view. Another thanked us for highlighting the bias and tendencies of liberal media.

How does the Chrome extension assess media credibility and bias?

The Chrome extension determines credibility and bias by not only looking at the publication but also assessing the article and the author. For credibility, we looked at the editorial strength of each publisher using the LexisNexis source rank. And for authors, we looked at their employer and whether they have won any journalistic awards.

For bias, we drew from the published methodology of Matthew Gentzkow/Jesse Shapiro in Econometrica (2010), which utilized the Congressional Record, and identified keywords or phrases that would be considered left or right. We then trained our AI/ML algorithm to apply that to indexed content and determine slant.

Additional information on how credibility and bias is ranked can be found on the Nobias website. We are committed to offering full transparency about our methods and technology.

How do you bring users up to speed?

Most users are aware of media bias but not as aware of their own role in exacerbating the bias in their newsfeeds. We’re not changing the news out there or how people find it online. And we certainly are not trying to limit their choices. What we are doing is enhancing the browsing experience with info on news bias and credibility, at the publication and article/author levels. Users can hover over the small Nobias icons that appear in Google, Facebook, and on several news sites, and go to the toolbar should they want more information. And as always, they can check our criteria for bias and credibility on the website, as well as check their own biases.

Who are your competitors?

This is a huge market with a lot of good work still to be done. It is certainly a big problem and there have been a range of approaches and solutions. What we wanted to do is not have consumers wait for big tech to resolve it and instead put power into the hands of users.

We provide insights at the article level rather than just information on the source or site, like most of our competitors. Our technology is scalable and can cover many more sites, compared to companies that manually curate information. Our end-users are our customers – we will never bypass our customer and go to Facebook or Google for monetization. Lastly, unlike many that may not be forthcoming about how they determine credibility/bias, our methodology is completely transparent – everything is available on our website.

What is the business model?

We have a freemium model where our end-users will always be the main priority and for them, the core functionality on political news will stay free with no ads. The company plans on expanding and unveiling premium features in the future, providing insights for financial news and health news as well as offering a kind of “Fitbit” for a more balanced media diet.


This article has previously been published on Silicon Alley


MARKET WATCH: Nobias Empowers Users with Info Needed to Better Evaluate News Choices

Nobias, a tech startup dedicated to promoting responsible/inclusive technology and countering online manipulation and misinformation, today announced the company, its first products, and its vision for putting the power of informed media consumption in the hands of users.

Nobias’s first product is a Google Chrome extension that shows the bias and credibility of news served up by Google search results and Facebook’s news feed. Users can drill down to evaluate individual articles and authors. Giving context about the news, even before users click, can counter the effects of algorithmic curation that may keep them in a filter bubble. And knowing about the credibility can help people avoid fake news.

The company was founded by CEO Tania Ahuja, a dynamic leader who holds a PhD in finance and is an accomplished business professional with extensive knowledge in data mining and risk management. She is building experienced journalism and AI advisory teams, to help keep Nobias effective and on the cutting edge.

“My main goal for founding Nobias was to promote responsible/inclusive technology that enables users to better control the news they read online,” said Ahuja. “We want to become like a Fitbit for media consumption. It may be impossible to completely eliminate bias, but we aim to equip everyone with tools to increase awareness about the information they consume and give them control over the algorithms that shape what they read and see online.”

Nobias draws from the published methodology of Matthew Gentzkow and Jesse Shapiro in Econometrica (2010), a top economics journal, to determine political bias. This data evaluates the leaning of a news source by looking at key phrases that have been or currently are used by a Democrat or Republican. The phrases determine a lean of left, right, or center.


Nobias also utilizes editorial rating information from LexisNexis to identify credible sources, both at the website level and the author level. Sources are ranked from 1-5; a 1 rating is a top national, international, or business news source, such as the New York Times or Wall Street Journal. Authors are ranked based on the editorial strength of their employer and on industry recognition.

The Chrome extension is currently able to provide information on US political news articles from 40,000 premium and online news archives and business sources. The tool works on Google’s search engine news as well as on Facebook’s news feed. The company plans to eventually expand the extension for financial news, health news and on an international scale.

About Nobias

Nobias was founded in 2018 as a resource dedicated to promoting responsible/inclusive technology to protect consumers from deceptive or misleading content on the internet. To determine the bias and credibility of an article, it combines artificial intelligence algorithms with methodology and editorial ratings by LexisNexis. Nobias’s goal is to help people understand the landscape of media bias and to give them control over the algorithms that shape what they read and see online.

For more information, visit www.nobias.com.


This article has previously been featured on Market Watch


DIGITAL JOURNAL: NewsGuard plug-in rates the Daily Mail as untrustworthy

London - The U.K.'s second-biggest-selling tabloid newspaper, The Daily Mail, which tends to support right-wing causes, has criticized a new browser plug-in called NewsGuard for rating its website as not maintaining basic standards for news reporting.

The Daily Mail has said that the web browser alert that criticizes its journalism should be changed, according to the BBC. The NewsGuard plug-in issues a warning whenever someone with the plug-in installed browses a news article from the paper's website. The plug-in warns users that the newspaper's website “generally fails to maintain basic standards of accuracy and accountability”. The text further reads that the Mail “has been forced to pay damages in numerous high-profile cases.”

As well as being an optional plug-in for browsers (which people need to opt in for), Microsoft is partnering with NewsGuard to offer the NewsGuard browser extension on Microsoft Edge, along with a feature in the Microsoft Edge mobile apps for iOS and Android to help customers evaluate news sources.

In terms of relative standing, NewsGuard awards The Daily Mail's website — Mail Online, which is one of the world’s biggest news websites — one out of five on credibility. This low rating is the same level as the Kremlin-backed RT news service.

In contrast, one of the Mail's rival papers, The Guardian, has been given a trustworthy rating by NewsGuard. This is not simply a matter of right-wing conservatism (The Daily Mail) versus left-of-center liberalism (The Guardian), for Fox News, which often covers similar news content to the Daily Mail, has been deemed satisfactory by NewsGuard.

NewsGuard has defended its warning and says that it alerted The Daily Mail in August that there were concerns over bias in its reporting and with some articles that did not stand up to a scrutiny of the facts. NewsGuard co-founder Steve Brill is quoted by The Drum as saying: “We spell out fairly clearly in the label exactly how many times we have attempted to contact them. The analyst that wrote this writeup got someone on the phone who, as soon as he heard who she was and where she was calling from, hung up. As of now, we would love to hear if they have a complaint or if they change anything.”

For more on media bias, read Digital Journal's interview with Dr. Tania Ahuja of the company Nobias. The company develops tools that can flag bias in news and grade articles for credibility.


This article has previously been published on Digital Journal


DIGITAL JOURNAL: Q&A: What is media bias and how can this be tackled?

According to Dr. Tania Ahuja, there are concerns with the state of online news, especially when it comes to bias. Ahuja founded Nobias, which offers tools that can flag bias in news and grade articles for credibility.

Dr. Tania Ahuja (founder and CEO of Nobias) is a leader and an expert in online news bias and manipulation. Dr. Ahuja was a director of Citigroup and she serves on the Board of Overseers at the Stern School of Business, NYU.

Digital Journal caught up with Dr. Tania Ahuja to discuss the topic of media bias and the digital technologies available that can help a reader to assess the reliability of an online article.

Digital Journal: How do you assess news bias?

Dr. Tania Ahuja: Online news has become increasingly commercial. Rather than news itself being the product, the customer is the product, or rather his or her attention, as carefully monitored by clicks and rewarded by advertising dollars. News “Netflix” models are springing up everywhere. As a result, we see more and more articles written in a way that will interest their readers; recent studies have shown that our own bias for sensational and negative news has led to more radical rather than impartial, factual news.

Many journalists subconsciously infuse their articles with bias, as argumentative pieces inherently incorporate personal opinion. Nobias understands the nuances of news and is not looking to change it; at the end of the day, news is still educational and crucial to the public. Rather, we are looking to provide context so readers are aware of exactly what they consume: how biased it is relative to other articles published that week, how credible the author is, and the editorial rating of the source which oversees the publishing of the article.

DJ: Do you use any technology to assist with the interpretation?

Ahuja: Nobias uses machine learning and natural language processing, a simple bag-of-words unigram model, to determine the bias of each article you read. While others categorize the political biases of sources based on subjective ratings of article samples, our technology objectively and transparently provides readers insights on the current article and, importantly, allows us to assess articles not only from big publications, but from local newspapers as well.

DJ: How do you grade bias?

Ahuja: Nobias draws from the methodology of Matthew Gentzkow and Jesse Shapiro published in Econometrica (2010), a top economics journal, to determine political bias. This methodology evaluates the leanings of a news source by looking at key phrases used by Democrats and Republicans in congressional speeches. The phrases determine a left, right, or neutral leaning. Nobias also utilizes editorial rating information from LexisNexis to identify credible sources, at both the website and author levels. Sources are ranked from 1-5; a 1 rating is a top national, international, or business news source, such as the New York Times or the Wall Street Journal. Authors are ranked based on their employer’s rank and on whether they have won journalistic awards.

DJ: What types of biases are the most common?

Ahuja: At the moment, we have placed political bias at the forefront of our focus. After perfecting our beta extension, focused solely on providing information on U.S. political bias and author credibility, we plan to expand into gender and financial news biases as well as into international news.

DJ: Are these biases deliberate or the result of institutional factors?

Ahuja: While some online news sources intentionally provide a partisan perspective to advance their agenda, most sources propagate biases that are the result of institutional factors. As individual bias is often the result of a subconscious understanding of the covert hierarchy in one’s environment, biases prevalent in the news reflect the journalist’s perspective applied to current events. The bias in the newsfeed is further exacerbated by the readers’ own unconscious bias if news “Netflix” models are in use.

DJ: What is the impact of these in terms of public perception?

Ahuja: Alternative facts and heavily biased news are providing each person with a different reality. Someone in the Midwest is not going to understand the world the same way someone from the Northeast would. This is not to say that differing opinions are bad; differing facts are. We are already at a point where basic facts, such as Obama’s birthplace, global warming, and the existence of HIV/AIDS, are disputed.

DJ: Are some news agencies worse than others for bias?

Ahuja: We believe that political bias is a spectrum, and therefore, while some sources are more biased than others, even a credible source might have occasional strongly opinionated articles. Most do contain op-ed sections and other columns where bias often lives and is more acceptable. We label sources (Left, Center-Left, Center, Center-Right, Right) based on the median leaning of their articles, and we provide a 5-point article rating (Reliably Left, Likely Left, Center, Likely Right, Reliably Right) to provide perspective on more or less moderate biases in various sources.

DJ: Are biased news and fake news different?

Ahuja: Yes, they are different. Nobias’s beta version has two separate methodologies for characterizing fake news (i.e., our credibility rating) and biased news (i.e., our political slant rating). We see fake news as providing “alternative facts” or blatant inaccuracies. In contrast, while biased news might not contain falsehoods, it might offer the opinions of one side more readily than the other, or omit the other perspective entirely. While credible vs. fake news is more binary, dealing with fact vs. fiction, neutral vs. biased news deals with opinion and perspective. Nobias uses editorial ratings generated by LexisNexis to determine credibility. These editorial ranks are applied to news outlets only; it is a source-level categorization indicating the LexisNexis editorial ranking of the source. An author’s credibility draws on the strength of the editorial ranking of where they work and whether they have won journalistic awards. Political slant is determined entirely independently, using the Gentzkow methodology.

DJ: What are the best sources of news, in terms of being bias free?

Ahuja: Completely bias-free news is hard to come by, as most authors are likely biased. Nobias is not pushing for completely unbiased news; at the end of the day, people want their news to interest them, which, for better or worse, cannot be plain facts with no spin. We just want readers to recognize their biases and approach news cognizant of them. There are a few Nobias-certified centrist sources, on the other hand, that apply a lot of effort to remain impartial in their reporting. These include BBC News, Bloomberg, Reuters, The Hill, USA Today, and the Wall Street Journal.

DJ: Who does your company provide services for?

Ahuja: Nobias provides services for politically conscious individuals looking to burst their filter bubble and gain a broader perspective on the news they receive.
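A unigram bag-of-words classifier of the kind Ahuja describes can be sketched with a tiny Naive Bayes model. The training snippets and labels below are invented toy data; the real system trains on a labeled LexisNexis article corpus with phrases derived from the Gentzkow-Shapiro methodology, not these examples.

```python
import math
from collections import Counter

def tokenize(text: str) -> list[str]:
    """Unigram tokenization: a simple bag of lowercase words."""
    return text.lower().split()

def train(docs):
    """docs: list of (text, label) pairs with labels 'left' or 'right'.
    Returns per-label unigram counts."""
    counts = {"left": Counter(), "right": Counter()}
    for text, lab in docs:
        counts[lab].update(tokenize(text))
    return counts

def classify(text: str, counts, alpha: float = 1.0) -> str:
    """Multinomial Naive Bayes with Laplace smoothing over the unigram bag."""
    vocab = set(counts["left"]) | set(counts["right"])
    scores = {}
    for lab, c in counts.items():
        total = sum(c.values())
        score = 0.0
        for w in tokenize(text):
            # Counter returns 0 for unseen words; alpha smooths them.
            score += math.log((c[w] + alpha) / (total + alpha * len(vocab)))
        scores[lab] = score
    return max(scores, key=scores.get)
```

For example, training on a handful of labeled snippets and then classifying a fresh sentence assigns it to whichever side's word distribution it better matches; at scale, the same mechanism lets per-article slant be scored rather than relying on a source-level label alone.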


This article has previously been published on Digital Journal


MASHABLE: 13 Ways To Save Your Thanksgiving

Happy Thanksgiving, now you'll be forced to spend time with your family. 

While you may be excited by the prospect of enjoying some quality time with your family, you may also have some concerns and reservations about dinner. There are — as we have all learned this year — so many ways that things can quickly shift from joyous to horrifying. 

But don't worry, we've come up with a number of unique and effective ways for you to shift the energy at your Thanksgiving dinner table, should things take a turn for the worse. From playing with pets to handy Alexa skills, we've got you covered this holiday season. Oh, and feel free to reuse these ideas come Christmas as well. 


Here are the 13 best ways to get out of your most uncomfortable Thanksgiving dinner moments:

1. Call upon Alexa for backup

Things just got super weird, and your family has an Echo or Echo Dot? Call out to Alexa for help. No, seriously.

Alexa now has over 20,000 skills available for almost anyone to use. We recommend asking: 

  • Alexa, play Beer Goggles

  • Alexa, launch This Day In History 

  • Alexa, play Jeopardy

You can also check out this handy article to see the 60 most useful Alexa abilities, for even more skills and commands to call out during dinner.

2. Change the subject 

This advice is obviously easier said than done, especially when conversations get heated and emotional. But, there are ways to effectively transition from difficult talks to slightly less trying ones!

Real Simple has a really smart and easy-to-use guide on changing the subject and repositioning conversations — invaluable insights for awkward family gatherings.

3. Keep football on in the background

Even if you're not super into football, keeping the game on in the background can serve as a welcome distraction at tenser moments during the evening and it gives everyone something to talk about.

Even if you know nothing about football, you can comment on the commercials, make fun of referees, or just pick a team to root for to make the time pass by faster.

4. Play with pets

Pets are meant to be played with — and this is even more true during a difficult time with family. Beckon pets over to you for cuddles and kisses, use a turkey bribe if you must, so that you can enjoy their comforting presence.

5. Discuss the book you read last

If you're afraid of accidentally bringing up a touchy subject, instead bring up a book or article you read last that you found to be super interesting — and invite others to do the same. 

As long as the last books you or your family read aren't super controversial, this is bound to be a good topic of conversation.

6. Play Bop It! 

Take a page from the Gilmore Girls, and break out the Bop It! when things get dire. 

Bop It! is quite possibly one of the easiest games to play: there's no setup and no board, and all you have to do is team up with someone to play it. And the game is goofy enough that you'll all find yourselves laughing at how impossible it is to beat.

7. Call or DM turkey hotlines

For years, Butterball has catered to less intuitive home cooks by keeping its "Turkey Talk Line" open for any and all turkey-related questions. But this year, they've gone digital.

You can now tweet, DM, or ask Alexa for help with all of your turkey cooking needs. (And yes, you can still call them.)

If you've run out of things to talk about, or just want to switch topics, tweeting at the hotline or asking Alexa some ridiculous turkey questions could be just what you and your family need.

8. Invite a "buffer" over for dinner

It might seem cruel to invite someone to Thanksgiving in the hopes that they'll create a buffer for you and your weirdo family, but in reality this tends to make the meal much more enjoyable for everyone.

If you know someone who doesn't have anywhere to go this holiday, or someone that's spending their first holiday in the states, why not invite them over? You and your family will spend time asking your buffer about what they do, their family traditions, and they'll be so distracted that they'll forget to ask you if you're seeing anyone special this year.

9. Bring some Play-Doh along

Another valuable tool in the Thanksgiving dinner arsenal is play dough! If there are kids at your gathering, they're sure to get a kick out of playing with Play-Doh, and adults will have fun revisiting this childhood toy.

It may seem kind of goofy, but your guests will definitely enjoy and welcome this distraction. 

You can buy a pack on Amazon, or if you're feeling overly ambitious this year, you can make your own fall-scented dough with the help of this recipe — just don't let anyone eat it!

10. Read off some funny tweets

Make a list on Twitter of all the funniest people you follow, and refresh it to keep yourself amused throughout the evening. 

And, if you're wondering who to follow, we recommend, I've Pet that Dog, for some extremely wholesome content, any of these celebrity trolls for hilarious commentary, and the always amusing comedian Rob Delaney. Alternatively, you can always just look back on some of the best tweets of the year.

11. Take a break

There's no hard and fast rule that says you're never allowed to get up from the dinner table. If you need a break, take a break. Walk around the block, chill out in the bathroom, or head anywhere else you like for a little reprieve. 

You're an adult! You can do whatever you want — with a few minor caveats. 

If things aren't going great, do what you need to make yourself more comfortable. Sure, dinners can get tense, but you shouldn't feel like you're being tortured.

12. Use telemarketers to break up fights

Telemarketers Save Thanksgiving is a website that allows you to send a call to whatever phone you want, which will then play a soothing recording of your choosing.

You can choose to have the National Anthem, calming ambient sounds, or positive affirmations, among other non-antagonizing messages, play when the telephone is answered. 

13. Eat through the pain

Look, we've all gathered for the food — and to be thankful, or whatever — so why not fully lean into the dinner.

Savor the smorgasbord of food at your table, compliment the dishes you like, and give thanks to whomever your host is. Everyone loves food, and almost everyone loves to talk about it, so why not make it your mission this year to discuss nothing but the meal.


This article has previously been featured on Mashable
