
Methodology

For this study, we identified and selected the most popular digital platforms in Kazakhstan. We considered not only the companies themselves, but also their individual products and services, including mobile ecosystems.

A digital platform is an integrated software and hardware solution that provides digital infrastructure and functionality for the interaction of various actors, applications, and data within a digital ecosystem.

To investigate the companies' public positions and policies on human rights compliance, we examined their official websites as well as the web resources of their parent companies and corporate groups.

The issues under research were grouped into three categories of indicators.

Corporate Governance 

The indicators in this category are intended to show that the company has governance processes in place that protect the human rights to freedom of expression and privacy. For a company to perform well in this category, its disclosures should at a minimum meet, and preferably exceed, the UN Guiding Principles on Business and Human Rights and other human rights standards on freedom of expression and privacy adopted by the Global Network Initiative.

G-indicators

F. Freedom of Expression and Information

Indicators in this category help determine whether the company demonstrates respect for the rights to freedom of expression and information in accordance with international human rights standards. The company's published policies and practices should clearly demonstrate how it avoids contributing to actions that may restrict these rights, unless such actions are lawful, proportionate, and serve a justifiable purpose. Companies that perform well on these indicators show their commitment to openness not only in how they respond to demands from governments and other stakeholders, but also in how they establish, explain, and comply with their own rules and principles that affect users' fundamental right to freedom of expression and information.

F-indicators

P. Privacy

Indicators in this category reflect whether companies communicate their commitment to users' right to privacy in an accessible way, through their policies and practices, in line with international human rights standards. Open corporate policies and practices demonstrate how companies take care not to facilitate actions that may violate users' privacy, unless such actions are lawful, proportionate, and serve a justifiable purpose; they also demonstrate a dedication to protecting users' digital security. Companies that perform well on these indicators show a steadfast commitment to transparency not only in how they respond to demands from state authorities and other actors, but also in how they define, communicate, and enforce their own policies and practices that affect user privacy.

P-indicators

The research process consisted of the following stages:

1. Compiling an inventory of each service's publicly available documents (first expert);
2. Scoring each indicator (first expert);
3. Validating the results (second and third experts);
4. Engaging with the companies under review and their digital platforms ("Company Engagement");
5. Performing "horizontal verification": comparing the companies' results with each other to maintain a unified, objective approach, and agreeing on the final scores with the Ranking Digital Rights team.

Each indicator has a list of parameters (elements), and companies receive a score (full, partial, or zero) for each parameter. The score reflects the degree of disclosure for each parameter of the indicator, based on one of the following possible answers:

"Yes" (full disclosure): the information disclosure complies with the indicator requirements.

"Partial": the company has disclosed some but not all aspects of the indicator, or the disclosure is not complete enough to meet all the requirements of the indicator.

"No disclosure found": researchers could not find information on the company's website that answers the element's question.

"No": relevant information exists, but it does not specifically address the subject of the parameter's question. This option is different from "No disclosure found," although neither earns points.

"Not Applicable": the element is not relevant to the company or service. Items marked as "Not Applicable" will not be counted in the scoring for or against the parameter.

Points

• Yes/full disclosure = 100
• Partial disclosure = 50
• No = 0
• No disclosure found = 0
• Not applicable: not included in scoring or averaging
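The scoring scheme above can be sketched as a small helper. This is illustrative only; the function name, answer labels, and data layout are assumptions, not part of the methodology itself.

```python
# Point values described in the methodology; "not applicable" answers
# are excluded from both scoring and averaging.
SCORES = {
    "yes": 100,      # full disclosure
    "partial": 50,   # partial disclosure
    "no": 0,         # information exists but does not address the parameter
    "no data": 0,    # no relevant disclosure found
    "n/a": None,     # not applicable: skipped entirely
}

def indicator_score(answers):
    """Average the element scores for one indicator, skipping 'not applicable'."""
    points = [SCORES[a] for a in answers if SCORES[a] is not None]
    if not points:
        return None  # every element was marked not applicable
    return sum(points) / len(points)

# Example: elements answered "yes", "partial", "n/a" -> (100 + 50) / 2 = 75
print(indicator_score(["yes", "partial", "n/a"]))
```

Under this sketch, an indicator whose elements are all "not applicable" contributes no score at all, rather than a zero, which matches the rule that such items count neither for nor against the company.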

Detailed description of the research indicators:

G1. Policy Commitment

Elements:

  1. Does the company make an explicit, clearly articulated policy commitment to human rights, including to freedom of expression and information?
  2. Does the company make an explicit, clearly articulated policy commitment to human rights, including to privacy?
  3. Does the company disclose an explicit, clearly articulated policy commitment to human rights in its development and use of algorithmic systems?

 

G2. Governance and management oversight

Elements:

  1. Does the company clearly disclose that the board of directors exercises formal oversight over how company practices affect freedom of expression and information?
  2. Does the company clearly disclose that the board of directors exercises formal oversight over how company practices affect privacy?
  3. Does the company clearly disclose that an executive-level committee, team, program or officer oversees how company practices affect freedom of expression and information?
  4. Does the company clearly disclose that an executive-level committee, team, program or officer oversees how company practices affect privacy?
  5. Does the company clearly disclose that a management-level committee, team, program or officer oversees how company practices affect freedom of expression and information?
  6. Does the company clearly disclose that a management-level committee, team, program or officer oversees how company practices affect privacy?

 

G3. Internal implementation

Elements:

  1. Does the company clearly disclose that it provides employee training on freedom of expression and information issues?
  2. Does the company clearly disclose that it provides employee training on privacy issues?
  3. Does the company clearly disclose that it maintains an employee whistleblower program through which employees can report concerns related to how the company treats its users’ freedom of expression and information rights?
  4. Does the company clearly disclose that it maintains an employee whistleblower program through which employees can report concerns related to how the company treats its users’ privacy rights?

 

G4(b). Impact assessment: Processes for policy enforcement

Elements:

  1. Does the company assess freedom of expression and information risks of enforcing its terms of service?
  2. Does the company conduct risk assessments of its enforcement of its privacy policies?
  3. Does the company assess discrimination risks associated with its processes for enforcing its terms of service?
  4. Does the company assess discrimination risks associated with its processes for enforcing its privacy policies?
  5. Does the company conduct additional evaluation whenever the company’s risk assessments identify concerns?
  6. Do senior executives and/or members of the company’s board of directors review and consider the results of assessments and due diligence in their decision-making?
  7. Does the company conduct assessments on a regular schedule?
  8. Are the company’s assessments assured by an external third party?
  9. Is the external third party that assures the assessments accredited to a relevant and reputable human rights standard by a credible organization?

 

G4(c) Impact assessment: Targeted advertising

Elements:

  1. Does the company assess freedom of expression and information risks associated with its targeted advertising policies and practices?
  2. Does the company assess privacy risks associated with its targeted advertising policies and practices?
  3. Does the company assess discrimination risks associated with its targeted advertising policies and practices?
  4. Does the company conduct additional evaluation whenever the company’s risk assessments identify concerns?
  5. Do senior executives and/or members of the company’s board of directors review and consider the results of assessments and due diligence in their decision-making?
  6. Does the company conduct assessments on a regular schedule?
  7. Are the company’s assessments assured by an external third party?
  8. Is the external third party that assures the assessment accredited to a relevant and reputable human rights standard by a credible organization?

 

G4(d). Impact assessment: Algorithmic systems

Elements:

  1. Does the company assess freedom of expression and information risks associated with its development and use of algorithmic systems?
  2. Does the company assess privacy risks associated with its development and use of algorithmic systems?
  3. Does the company assess discrimination risks associated with its development and use of algorithmic systems?
  4. Does the company conduct additional evaluation whenever the company’s risk assessments identify concerns?
  5. Do senior executives and/or members of the company’s board of directors review and consider the results of assessments and due diligence in their decision-making?
  6. Does the company conduct assessments on a regular schedule?
  7. Are the company’s assessments assured by an external third party?
  8. Is the external third party that assures the assessment accredited to a relevant and reputable human rights standard by a credible organization?

 

G4(a). Impact assessment: Governments and regulations

Elements:

  1. Does the company assess how laws affect freedom of expression and information in jurisdictions where it operates?
  2. Does the company assess how laws affect privacy in jurisdictions where it operates?
  3. Does the company assess freedom of expression and information risks associated with existing products and services in jurisdictions where it operates?
  4. Does the company assess privacy risks associated with existing products and services in jurisdictions where it operates?
  5. Does the company assess freedom of expression and information risks associated with a new activity, including the launch and/or acquisition of new products, services, or companies, or entry into new markets or jurisdictions?
  6. Does the company assess privacy risks associated with a new activity, including the launch and/or acquisition of new products, services, or companies, or entry into new markets or jurisdictions?
  7. Does the company conduct additional evaluation whenever the company’s risk assessments identify concerns?
  8. Do senior executives and/or members of the company’s board of directors review and consider the results of assessments and due diligence in their decision-making?
  9. Does the company conduct assessments on a regular schedule?
  10. Are the company’s assessments assured by an external third party?
  11. Is the independent third-party organization providing the assessment a credible organization accredited to an appropriate authoritative human rights standard?

F1(a). Access to terms of service

Elements:

  1. Are the company’s terms of service easy to find?
  2. Are the terms of service available in the primary language(s) spoken by users in the company’s home jurisdiction?
  3. Are the terms of service presented in an understandable manner?

 

F1(b). Access to advertising content policies

Elements:

  1. Are the company’s advertising content policies easy to find?
  2. Are the company’s advertising content policies available in the primary language(s) spoken by users in the company’s home jurisdiction?
  3. Are the company’s advertising content policies presented in an understandable manner?
  4. (For mobile ecosystems): Does the company clearly disclose that it requires apps made available through its app store to provide users with an advertising content policy?
  5. (For personal digital assistant ecosystems): Does the company clearly disclose that it requires skills made available through its skill store to provide users with an advertising content policy?

 

F1(c). Access to advertising targeting policies

Elements:

  1. Are the company’s advertising targeting policies easy to find?
  2. Are the advertising targeting policies available in the primary language(s) spoken by users in the company’s home jurisdiction?
  3. Are the advertising targeting policies presented in an understandable manner?
  4. (For mobile ecosystems): Does the company clearly disclose that it requires apps made available through its app store to provide users with an advertising targeting policy?
  5. (For personal digital assistant ecosystems): Does the company clearly disclose that it requires skills made available through its skill store to provide users with an advertising targeting policy?

 

F1(d). Access to algorithmic system use policies

Elements:

  1. Are the company’s algorithmic system use policies easy to find?
  2. Are the algorithmic system use policies available in the primary language(s) spoken by users in the company’s home jurisdiction?
  3. Are the algorithmic system use policies presented in an understandable manner?

 

F3(a). Process for terms of service enforcement

Elements:

  1. Does the company clearly disclose what types of content or activities it does not permit?
  2. Does the company clearly disclose why it may restrict a user’s account?
  3. Does the company clearly disclose information about the processes it uses to identify content or accounts that violate the company’s rules?
  4. Does the company clearly disclose how it uses algorithmic systems to flag content that might violate the company’s rules?
  5. Does the company clearly disclose whether any government authorities receive priority consideration when flagging content to be restricted for violating the company’s rules?
  6. Does the company clearly disclose whether any private entities receive priority consideration when flagging content to be restricted for violating the company’s rules?
  7. Does the company clearly disclose its process for enforcing its rules once violations are detected?

 

F6. Data about government demands to restrict content and accounts

Elements:

  1. Does the company break out the number of government demands it receives by country?
  2. Does the company list the number of accounts affected?
  3. Does the company list the number of pieces of content or URLs affected?
  4. Does the company list the types of subject matter associated with the government demands it receives?
  5. Does the company list the number of government demands that come from different legal authorities?
  6. Does the company list the number of government demands it knowingly receives from government officials to restrict content or accounts through unofficial processes?
  7. Does the company list the number of government demands with which it complied?
  8. Does the company publish the original government demands or disclose that it provides copies to a public third-party archive?
  9. Does the company report this data at least once a year?

 

F7. Data about private requests for content or account restriction

Elements:

  1. Does the company break out the number of requests to restrict content or accounts that it receives through private processes?
  2. Does the company list the number of accounts affected?
  3. Does the company list the number of pieces of content or URLs affected?
  4. Does the company list the reasons for removal associated with the requests it receives?
  5. Does the company clearly disclose the private processes that made requests?
  6. Does the company list the number of requests it complied with?
  7. Does the company publish the original requests or disclose that it provides copies to a public third-party archive?
  8. Does the company report this data at least once a year?
  9. Can the data be exported as a structured data file?
  10. Does the company clearly disclose that its reporting covers all types of requests that it receives through private processes?

F11. Identity policy

  1. Does the company require users to verify their identity with their government-issued identification, or with other forms of identification that could be connected to their offline identity?

F12. Algorithmic content curation, recommendation, and/or ranking systems

Elements:

  1. Does the company clearly disclose whether it uses algorithmic systems to curate, recommend, and/or rank the content that users can access through its platform?
  2. Does the company clearly disclose how the algorithmic systems are deployed to curate, recommend, and/or rank content, including the variables that influence these systems?
  3. Does the company clearly disclose what options users have to control the variables that the algorithmic content curation, recommendation, and/or ranking system takes into account?
  4. Does the company clearly disclose whether algorithmic systems are used to automatically curate, recommend, and/or rank content by default?
  5. Does the company clearly disclose that users can opt in to automated content curation, recommendation, and/or ranking systems?

P1(a). Access to privacy policies

Elements:

  1. Are the company’s privacy policies easy to find?
  2. Are the privacy policies available in the primary language(s) spoken by users in the company’s home jurisdiction?
  3. Are the policies presented in an understandable manner?
  4. (For mobile ecosystems): Does the company disclose that it requires apps made available through its app store to provide users with a privacy policy?
  5. (For personal digital assistant ecosystems): Does the company disclose that it requires skills made available through its skill store to provide users with a privacy policy?

 

P2(a). Changes to privacy policies

Elements:

  1. Does the company clearly disclose that it directly notifies users about all changes to its privacy policies?
  2. Does the company clearly disclose how it will directly notify users of changes?
  3. Does the company clearly disclose the timeframe within which it directly notifies users of changes prior to these changes coming into effect?
  4. Does the company maintain a public archive or change log?
  5. (For mobile ecosystems): Does the company clearly disclose that it requires apps sold through its app store to notify users when the app changes its privacy policy?

 

P3(a). Collection of user information

Elements:

  1. Does the company clearly disclose what types of user information it collects?
  2. For each type of user information the company collects, does the company clearly disclose how it collects that user information?
  3. Does the company clearly disclose that it limits collection of user information to what is directly relevant and necessary to accomplish the purpose of its service?
  4. (For mobile ecosystems): Does the company clearly disclose that it evaluates whether the privacy policies of third-party apps made available through its app store disclose what user information the apps collect?
  5. (For mobile ecosystems): Does the company clearly disclose that it evaluates whether third-party apps made available through its app store limit collection of user information to what is directly relevant and necessary to accomplish the purpose of the app?
  6. (For personal digital assistant ecosystems): Does the company clearly disclose that it evaluates whether the privacy policies of third-party skills made available through its skill store disclose what user information the skills collect?
  7. (For personal digital assistant ecosystems): Does the company clearly disclose that it evaluates whether third-party skills made available through its skill store limit collection of user information to what is directly relevant and necessary to accomplish the purpose of the skill?

 

P4. Sharing of user information

Elements:

  1. For each type of user information the company collects, does the company clearly disclose whether it shares that user information?
  2. For each type of user information the company shares, does the company clearly disclose the types of third parties with which it shares that user information?
  3. Does the company clearly disclose that it may share user information with government(s) or legal authorities?
  4. For each type of user information the company shares, does the company clearly disclose the names of all third parties with which it shares user information?

 

P5. Purpose for collecting, inferring, and sharing user information

Elements:

  1. For each type of user information the company collects, does the company clearly disclose its purpose for collection?
  2. For each type of user information the company infers, does the company clearly disclose its purpose for the inference?
  3. Does the company clearly disclose whether it combines user information from various company services and if so, why?
  4. For each type of user information the company shares, does the company clearly disclose its purpose for sharing?
  5. Does the company clearly disclose that it limits its use of user information to the purpose for which it was collected or inferred?

 

P6. Retention of user information

Elements:

  1. For each type of user information the company collects, does the company clearly disclose how long it retains that user information?
  2. Does the company clearly disclose what de-identified user information it retains?
  3. Does the company clearly disclose the process for de-identifying user information?
  4. Does the company clearly disclose that it deletes all user information after users terminate their account?
  5. Does the company clearly disclose the time frame in which it will delete user information after users terminate their account?

 

P7. Users’ control over their own user information

Elements:

  1. For each type of user information the company collects, does the company clearly disclose whether users can control the company’s collection of this user information?
  2. For each type of user information the company collects, does the company clearly disclose whether users can delete this user information?
  3. For each type of user information the company infers on the basis of collected information, does the company clearly disclose whether users can control if the company can attempt to infer this user information?
  4. For each type of user information the company infers on the basis of collected information, does the company clearly disclose whether users can delete this user information?
  5. Does the company clearly disclose that it provides users with options to control how their user information is used for targeted advertising?

 

P8. Users’ access to their own user information

Elements:

  1. Does the company clearly disclose that users can obtain a copy of their user information?
  2. Does the company clearly disclose what user information users can obtain?
  3. Does the company clearly disclose that users can obtain their user information in a structured data format?
  4. Does the company clearly disclose that users can obtain all public-facing and private user information a company holds about them?
  5. Does the company clearly disclose that users can access the list of advertising audience categories to which the company has assigned them?
  6. Does the company clearly disclose that users can obtain all the information that a company has inferred about them?

 

P9. Collection of user information from third parties

Elements:

  1. (For digital platforms) Does the company clearly disclose what user information it collects from third-party websites through technical means?
  2. (For digital platforms) Does the company clearly explain how it collects user information from third parties through technical means?
  3. (For digital platforms) Does the company clearly disclose its purpose for collecting user information from third parties through technical means?
  4. (For digital platforms) Does the company clearly disclose how long it retains the user information it collects from third parties through technical means?
  5. (For digital platforms) Does the company clearly disclose that it respects user-generated signals to opt out of data collection?
  6. Does the company clearly disclose what user information it collects from third parties through non-technical means?
  7. Does the company clearly disclose how it collects user information from third parties through non-technical means?
  8. Does the company clearly disclose its purpose for collecting user information from third parties through non-technical means?
  9. Does the company clearly disclose how long it retains the user information it collects from third parties through non-technical means?

 

P10(a). Process for responding to government demands for user information

Elements:

  1. Does the company clearly disclose its process for responding to non-judicial government demands?
  2. Does the company clearly disclose its process for responding to court orders?
  3. Does the company clearly disclose its process for responding to government demands from foreign jurisdictions?
  4. Do the company’s explanations clearly disclose the legal basis under which it may comply with government demands?
  5. Does the company clearly disclose that it carries out due diligence on government demands before deciding how to respond?
  6. Does the company commit to push back on inappropriate or overbroad government demands?
  7. Does the company provide clear guidance or examples of implementation of its process for government demands?

 

P10(b). Process for responding to private requests for user information

Elements:

  1. Does the company clearly disclose its process for responding to requests made through private processes?
  2. Do the company’s explanations clearly disclose the basis under which it may comply with requests made through private processes?
  3. Does the company clearly disclose that it carries out due diligence on requests made through private processes before deciding how to respond?
  4. Does the company commit to push back on inappropriate or overbroad requests made through private processes?
  5. Does the company provide clear guidance or examples of implementation of its process of responding to requests made through private processes?

 

P12. User notification about third-party requests for user information

Elements:

  1. Does the company clearly disclose that it notifies users when government entities (including courts or other judicial bodies) demand their user information?
  2. Does the company clearly disclose that it notifies users when they receive requests for their user information through private processes?
  3. Does the company clearly disclose situations when it might not notify users, including a description of the types of government demands it is prohibited by law from disclosing to users?

 

P15. Data breaches

Elements:

  1. Does the company clearly disclose that it will notify the relevant authorities without undue delay when a data breach occurs?
  2. Does the company clearly disclose its process for notifying data subjects who might be affected by a data breach?
  3. Does the company clearly disclose what kinds of steps it will take to address the impact of a data breach on its users?

 

P17. Account security (digital platforms)

Elements:

  1. Does the company clearly disclose that it deploys advanced authentication methods to prevent fraudulent access?
  2. Does the company clearly disclose that users can view their recent account activity?
  3. Does the company clearly disclose that it notifies users about unusual account activity and possible unauthorized access to their accounts?

 

P18. Inform and educate users about potential risks

Elements:

  1. Does the company publish practical materials that educate users on how to protect themselves from cybersecurity risks relevant to their products or services?